binman support for replacing files


Merge tag 'dm-pull-29jul19' of https://gitlab.denx.de/u-boot/custodians/u-boot-dm

binman support for replacing files
This commit is contained in:
Tom Rini 2019-07-29 17:59:51 -04:00
commit de17e1fc03
49 changed files with 1980 additions and 363 deletions


@@ -1189,9 +1189,10 @@ u-boot.ldr: u-boot
# ---------------------------------------------------------------------------
# Use 'make BINMAN_DEBUG=1' to enable debugging
quiet_cmd_binman = BINMAN $@
cmd_binman = $(srctree)/tools/binman/binman build -u -d u-boot.dtb -O . -m \
cmd_binman = $(srctree)/tools/binman/binman $(if $(BINMAN_DEBUG),-D) \
build -u -d u-boot.dtb -O . -m \
-I . -I $(srctree) -I $(srctree)/board/$(BOARDDIR) \
$(if $(BINMAN_DEBUG),-D) $(BINMAN_$(@F))
$(BINMAN_$(@F))
OBJCOPYFLAGS_u-boot.ldr.hex := -I binary -O ihex


@@ -114,7 +114,7 @@ int fdt_resize(void *fdt, void *buf, int bufsize)
FDT_SW_CHECK_HEADER(fdt);
headsize = fdt_off_dt_struct(fdt);
headsize = fdt_off_dt_struct(fdt) + fdt_size_dt_struct(fdt);
tailsize = fdt_size_dt_strings(fdt);
if ((headsize + tailsize) > bufsize)


@@ -481,6 +481,29 @@ name-prefix:
distinguish binaries with otherwise identical names.
Image Properties
----------------
Image nodes act like sections but also have a few extra properties:
filename:
Output filename for the image. This defaults to image.bin (or, in the
case of multiple images, <nodename>.bin, where <nodename> is the name of
the image node).
allow-repack:
Create an image that can be repacked. With this option it is possible
to change anything in the image after it is created, including updating
the position and size of image components. By default this is not
permitted since it is not possible to know whether this might violate a
constraint in the image description. For example, if a section has to
increase in size to hold a larger binary, that might cause the section
to fall outside its allowed region (e.g. the read-only portion of flash).
Adding this property causes the original offset and size values in the
image description to be stored in the FDT and fdtmap.
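For illustration, a minimal image description using this property might look like the sketch below; the entry names are examples only, not taken from this commit:

```dts
/* Sketch of a binman node with allow-repack enabled. The fdtmap entry is
 * what later lets 'binman replace' locate entries in the finished image.
 */
binman {
	allow-repack;

	u-boot {
	};

	fdtmap {
	};
};
```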
Entry Documentation
-------------------
@@ -557,6 +580,35 @@ or just a selection:
$ binman extract -i image.bin "*u-boot*" -O outdir
Replacing files in an image
---------------------------
You can replace files in an existing firmware image created by binman, provided
that there is an 'fdtmap' entry in the image. For example:
$ binman replace -i image.bin section/cbfs/u-boot
which will write the contents of the file 'u-boot' from the current directory
to that entry, compressing if necessary. If the entry size changes, you must
add the 'allow-repack' property to the original image before generating it (see
above), otherwise you will get an error.
You can also use a particular file, in this case u-boot.bin:
$ binman replace -i image.bin section/cbfs/u-boot -f u-boot.bin
It is possible to replace all files from a source directory which uses the same
hierarchy as the entries:
$ binman replace -i image.bin -I indir
Files that are missing will generate a warning.
You can also replace just a selection of entries:
$ binman replace -i image.bin "*u-boot*" -I indir
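The directory lookup described above can be sketched as follows; this is an illustrative helper, not binman's own code, showing how each entry path maps onto a file under the input directory:

```python
import os

def files_for_entries(indir, entry_paths):
    """Map binman entry paths (e.g. '/section/cbfs/u-boot') to input files.

    The leading '/' is stripped and the rest of the path is joined onto the
    input directory, so entries inside a sub-section must appear in matching
    subdirectories. Missing files are reported rather than failing.
    """
    found, missing = [], []
    for path in entry_paths:
        fname = os.path.join(indir, path.lstrip('/'))
        # A missing file only produces a warning in binman; here we just
        # collect it so the caller can report it
        (found if os.path.exists(fname) else missing).append(fname)
    return found, missing
```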
Logging
-------
@@ -644,22 +696,35 @@ large enough to hold all the entries.
7. CheckEntries() - checks that the entries do not overlap, nor extend
outside the image.
8. SetCalculatedProperties() - update any calculated properties in the device
8. SetImagePos() - sets the image position of every entry. This is the absolute
position 'image-pos', as opposed to 'offset' which is relative to the containing
section. This must be done after all offsets are known, which is why it is quite
late in the ordering.
9. SetCalculatedProperties() - update any calculated properties in the device
tree. This sets the correct 'offset' and 'size' values, for example.
9. ProcessEntryContents() - this calls Entry.ProcessContents() on each entry.
10. ProcessEntryContents() - this calls Entry.ProcessContents() on each entry.
The default implementation does nothing. This can be overridden to adjust the
contents of an entry in some way. For example, it would be possible to create
an entry containing a hash of the contents of some other entries. At this
stage the offset and size of entries should not be adjusted unless absolutely
necessary, since it requires a repack (going back to PackEntries()).
10. WriteSymbols() - write the value of symbols into the U-Boot SPL binary.
11. ResetForPack() - if the ProcessEntryContents() step failed, in that an entry
has changed its size, then there is no alternative but to go back to step 5 and
try again, repacking the entries with the updated size. ResetForPack() removes
the fixed offset/size values added by binman, so that the packing can start from
scratch.
12. WriteSymbols() - write the value of symbols into the U-Boot SPL binary.
See 'Access to binman entry offsets at run time' below for a description of
what happens in this stage.
11. BuildImage() - builds the image and writes it to a file. This is the final
step.
13. BuildImage() - builds the image and writes it to a file
14. WriteMap() - writes a text file containing a map of the image. This is the
final step.
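The pack / process / repack cycle in steps 5-11 can be sketched with a toy model; this is a simplification for illustration (entries are plain dicts, 'grow_to' simulates a size change during ProcessEntryContents), not binman's real classes:

```python
def pack_until_stable(entries, max_passes=5):
    """Sketch of the repack loop: pack, process contents, retry on growth.

    Each entry is a dict with 'size'; an optional 'grow_to' key simulates an
    entry whose contents change size after packing (e.g. a compressed device
    tree), which forces a reset and another packing pass.
    """
    for _ in range(max_passes):
        offset = 0
        for entry in entries:       # PackEntries(): assign offsets in order
            entry['offset'] = offset
            offset += entry['size']
        sizes_ok = True
        for entry in entries:       # ProcessEntryContents(): may change size
            new_size = entry.pop('grow_to', entry['size'])
            if new_size != entry['size']:
                entry['size'] = new_size
                sizes_ok = False
        if sizes_ok:
            return entries
        for entry in entries:       # ResetForPack(): drop calculated offsets
            entry['offset'] = None
    raise ValueError('Entries changed size after packing (tried %d passes)' %
                     max_passes)
```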
Automatic .dtsi inclusion
@@ -909,10 +974,10 @@ Some ideas:
- Allow easy building of images by specifying just the board name
- Support building an image for a board (-b) more completely, with a
configurable build directory
- Support updating binaries in an image (with no size change / repacking)
- Support updating binaries in an image (with repacking)
- Support adding FITs to an image
- Support for ARM Trusted Firmware (ATF)
- Detect invalid properties in nodes
- Sort the fdtmap by offset
--
Simon Glass <sjg@chromium.org>


@@ -230,7 +230,9 @@ Properties / Entry arguments:
None
An FDT map is just a header followed by an FDT containing a list of all the
entries in the image.
entries in the image. The root node corresponds to the image node in the
original FDT, and an image-name property indicates the image name in that
original tree.
The header is the string _FDTMAP_ followed by 8 unused bytes.
@@ -244,6 +246,7 @@ FDT with the position of each entry.
Example output for a simple image with U-Boot and an FDT map:
/ {
image-name = "binman";
size = <0x00000112>;
image-pos = <0x00000000>;
offset = <0x00000000>;
@@ -259,6 +262,9 @@ Example output for a simple image with U-Boot and an FDT map:
};
};
If allow-repack is used then 'orig-offset' and 'orig-size' properties are
added as necessary. See the binman README.
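Reading the fdtmap back out can be sketched as below. This is not binman's actual reader, just an illustration under the layout stated above: an 8-byte `_FDTMAP_` magic, 8 unused bytes, then a standard DTB whose length comes from the FDT header's big-endian totalsize field:

```python
import struct

FDTMAP_MAGIC = b'_FDTMAP_'
FDTMAP_HDR_LEN = 16  # 8-byte magic plus 8 unused bytes

def read_fdtmap(data, offset):
    """Return the FDT blob of an fdtmap entry at 'offset' within an image.

    The DTB follows the 16-byte header immediately; its length is read from
    the totalsize field at byte offset 4 of the DTB's own header.
    """
    if data[offset:offset + 8] != FDTMAP_MAGIC:
        raise ValueError('No fdtmap at offset %#x' % offset)
    fdt_start = offset + FDTMAP_HDR_LEN
    totalsize = struct.unpack('>L', data[fdt_start + 4:fdt_start + 8])[0]
    return data[fdt_start:fdt_start + totalsize]
```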
Entry: files: Entry containing a set of files
@@ -308,7 +314,8 @@ see www.flashrom.org/Flashrom for more information.
When used, this entry will be populated with an FMAP which reflects the
entries in the current image. Note that any hierarchy is squashed, since
FMAP does not support this.
FMAP does not support this. Also, CBFS entries appear as a single entry -
the sub-entries are ignored.


@@ -208,6 +208,7 @@ class CbfsFile(object):
cbfs_offset: Offset of file data in bytes from start of CBFS, or None to
place this file anywhere
data: Contents of file, uncompressed
orig_data: Original data added to the file, possibly compressed
data_len: Length of (possibly compressed) data in bytes
ftype: File type (TYPE_...)
compression: Compression type (COMPRESS_...)
@@ -226,6 +227,7 @@ class CbfsFile(object):
self.offset = None
self.cbfs_offset = cbfs_offset
self.data = data
self.orig_data = data
self.ftype = ftype
self.compress = compress
self.memlen = None
@@ -240,9 +242,9 @@ class CbfsFile(object):
"""Handle decompressing data if necessary"""
indata = self.data
if self.compress == COMPRESS_LZ4:
data = tools.Decompress(indata, 'lz4')
data = tools.Decompress(indata, 'lz4', with_header=False)
elif self.compress == COMPRESS_LZMA:
data = tools.Decompress(indata, 'lzma')
data = tools.Decompress(indata, 'lzma', with_header=False)
else:
data = indata
self.memlen = len(data)
@@ -361,9 +363,9 @@ class CbfsFile(object):
elif self.ftype == TYPE_RAW:
orig_data = data
if self.compress == COMPRESS_LZ4:
data = tools.Compress(orig_data, 'lz4')
data = tools.Compress(orig_data, 'lz4', with_header=False)
elif self.compress == COMPRESS_LZMA:
data = tools.Compress(orig_data, 'lzma')
data = tools.Compress(orig_data, 'lzma', with_header=False)
self.memlen = len(orig_data)
self.data_len = len(data)
attr = struct.pack(ATTR_COMPRESSION_FORMAT,
@@ -715,7 +717,7 @@ class CbfsReader(object):
file_pos = fd.tell()
data = fd.read(FILE_HEADER_LEN)
if len(data) < FILE_HEADER_LEN:
print('File header at %x ran out of data' % file_pos)
print('File header at %#x ran out of data' % file_pos)
return False
magic, size, ftype, attr, offset = struct.unpack(FILE_HEADER_FORMAT,
data)
@@ -724,7 +726,7 @@ class CbfsReader(object):
pos = fd.tell()
name = self._read_string(fd)
if name is None:
print('String at %x ran out of data' % pos)
print('String at %#x ran out of data' % pos)
return False
if DEBUG:


@@ -372,7 +372,7 @@ class TestCbfs(unittest.TestCase):
with io.BytesIO(newdata) as fd:
fd.seek(pos)
self.assertEqual(False, cbr._read_next_file(fd))
self.assertIn('File header at 0 ran out of data', stdout.getvalue())
self.assertIn('File header at 0x0 ran out of data', stdout.getvalue())
def test_cbfs_bad_file_string(self):
"""Check handling of an incomplete filename string"""
@@ -394,7 +394,7 @@ class TestCbfs(unittest.TestCase):
with io.BytesIO(newdata) as fd:
fd.seek(pos)
self.assertEqual(False, cbr._read_next_file(fd))
self.assertIn('String at %x ran out of data' %
self.assertIn('String at %#x ran out of data' %
cbfs_util.FILE_HEADER_LEN, stdout.getvalue())
def test_cbfs_debug(self):


@@ -84,6 +84,23 @@ controlled by a description in the board device tree.'''
extract_parser.add_argument('-U', '--uncompressed', action='store_true',
help='Output raw uncompressed data for compressed entries')
replace_parser = subparsers.add_parser('replace',
help='Replace entries in an image')
replace_parser.add_argument('-C', '--compressed', action='store_true',
help='Input data is already compressed if needed for the entry')
replace_parser.add_argument('-i', '--image', type=str, required=True,
help='Image filename to update')
replace_parser.add_argument('-f', '--filename', type=str,
help='Input filename to read from')
replace_parser.add_argument('-F', '--fix-size', action='store_true',
help="Don't allow entries to be resized")
replace_parser.add_argument('-I', '--indir', type=str, default='',
help='Path to directory to use for input files')
replace_parser.add_argument('-m', '--map', action='store_true',
default=False, help='Output a map file for the updated image')
replace_parser.add_argument('paths', type=str, nargs='*',
help='Paths within file to replace (wildcard)')
test_parser = subparsers.add_parser('test', help='Run tests')
test_parser.add_argument('-P', '--processes', type=int,
help='set number of processes to use for running tests')


@@ -128,7 +128,7 @@ def ExtractEntries(image_fname, output_fname, outdir, entry_paths,
otherwise
outdir: Output directory to use (for any number of files), else None
entry_paths: List of entry paths to extract
decomp: True to compress the entry data
decomp: True to decompress the entry data
Returns:
List of EntryInfo records that were written
@@ -138,9 +138,9 @@ def ExtractEntries(image_fname, output_fname, outdir, entry_paths,
# Output an entry to a single file, as a special case
if output_fname:
if not entry_paths:
raise ValueError('Must specify an entry path to write with -o')
raise ValueError('Must specify an entry path to write with -f')
if len(entry_paths) != 1:
raise ValueError('Must specify exactly one entry path to write with -o')
raise ValueError('Must specify exactly one entry path to write with -f')
entry = image.FindEntryPath(entry_paths[0])
data = entry.ReadData(decomp)
tools.WriteFile(output_fname, data)
@@ -169,6 +169,287 @@ def ExtractEntries(image_fname, output_fname, outdir, entry_paths,
return einfos
def BeforeReplace(image, allow_resize):
"""Handle getting an image ready for replacing entries in it
Args:
image: Image to prepare
allow_resize: True to reset offset/size information so the image can be
    re-packed
"""
state.PrepareFromLoadedData(image)
image.LoadData()
# If repacking, drop the old offset/size values except for the original
# ones, so we are only left with the constraints.
if allow_resize:
image.ResetForPack()
def ReplaceOneEntry(image, entry, data, do_compress, allow_resize):
"""Handle replacing a single entry an an image
Args:
image: Image to update
entry: Entry to write
data: Data to replace with
do_compress: True to compress the data if needed, False if data is
already compressed so should be used as is
allow_resize: True to allow entries to change size (this does a re-pack
of the entries), False to raise an exception
"""
if not entry.WriteData(data, do_compress):
if not image.allow_repack:
entry.Raise('Entry data size does not match, but allow-repack is not present for this image')
if not allow_resize:
entry.Raise('Entry data size does not match, but resize is disabled')
def AfterReplace(image, allow_resize, write_map):
"""Handle write out an image after replacing entries in it
Args:
image: Image to write
allow_resize: True to allow entries to change size (this does a re-pack
of the entries), False to raise an exception
write_map: True to write a map file
"""
tout.Info('Processing image')
ProcessImage(image, update_fdt=True, write_map=write_map,
get_contents=False, allow_resize=allow_resize)
def WriteEntryToImage(image, entry, data, do_compress=True, allow_resize=True,
write_map=False):
"""Replace the data in a single entry, then repack and write out the image"""
BeforeReplace(image, allow_resize)
tout.Info('Writing data to %s' % entry.GetPath())
ReplaceOneEntry(image, entry, data, do_compress, allow_resize)
AfterReplace(image, allow_resize=allow_resize, write_map=write_map)
def WriteEntry(image_fname, entry_path, data, do_compress=True,
allow_resize=True, write_map=False):
"""Replace an entry in an image
This replaces the data in a particular entry in an image. The size of the
new data must match the size of the old data unless allow_resize is True.
Args:
image_fname: Image filename to process
entry_path: Path to entry to extract
data: Data to replace with
do_compress: True to compress the data if needed, False if data is
already compressed so should be used as is
allow_resize: True to allow entries to change size (this does a re-pack
of the entries), False to raise an exception
write_map: True to write a map file
Returns:
Image object that was updated
"""
tout.Info("Write entry '%s', file '%s'" % (entry_path, image_fname))
image = Image.FromFile(image_fname)
entry = image.FindEntryPath(entry_path)
WriteEntryToImage(image, entry, data, do_compress=do_compress,
allow_resize=allow_resize, write_map=write_map)
return image
def ReplaceEntries(image_fname, input_fname, indir, entry_paths,
do_compress=True, allow_resize=True, write_map=False):
"""Replace the data in one or more entries from input files
Args:
image_fname: Image filename to process
input_fname: Single input filename to use if replacing one file, None
otherwise
indir: Input directory to use (for any number of files), else None
entry_paths: List of entry paths to extract
do_compress: True if the input data is uncompressed and may need to be
compressed if the entry requires it, False if the data is already
compressed.
allow_resize: True to allow entries to change size (this does a re-pack
    of the entries), False to raise an exception
write_map: True to write a map file
Returns:
List of EntryInfo records that were written
"""
image = Image.FromFile(image_fname)
# Replace an entry from a single file, as a special case
if input_fname:
if not entry_paths:
raise ValueError('Must specify an entry path to read with -f')
if len(entry_paths) != 1:
raise ValueError('Must specify exactly one entry path to write with -f')
entry = image.FindEntryPath(entry_paths[0])
data = tools.ReadFile(input_fname)
tout.Notice("Read %#x bytes from file '%s'" % (len(data), input_fname))
WriteEntryToImage(image, entry, data, do_compress=do_compress,
allow_resize=allow_resize, write_map=write_map)
return
# Otherwise we will input from a path given by the entry path of each entry.
# This means that files must appear in subdirectories if they are part of
# a sub-section.
einfos = image.GetListEntries(entry_paths)[0]
tout.Notice("Replacing %d matching entries in image '%s'" %
(len(einfos), image_fname))
BeforeReplace(image, allow_resize)
for einfo in einfos:
entry = einfo.entry
if entry.GetEntries():
tout.Info("Skipping section entry '%s'" % entry.GetPath())
continue
path = entry.GetPath()[1:]
fname = os.path.join(indir, path)
if os.path.exists(fname):
tout.Notice("Write entry '%s' from file '%s'" %
(entry.GetPath(), fname))
data = tools.ReadFile(fname)
ReplaceOneEntry(image, entry, data, do_compress, allow_resize)
else:
tout.Warning("Skipping entry '%s' from missing file '%s'" %
(entry.GetPath(), fname))
AfterReplace(image, allow_resize=allow_resize, write_map=write_map)
return image
def PrepareImagesAndDtbs(dtb_fname, select_images, update_fdt):
"""Prepare the images to be processed and select the device tree
This function:
- reads in the device tree
- finds and scans the binman node to create all entries
- selects which images to build
- Updates the device trees with placeholder properties for offset,
image-pos, etc.
Args:
dtb_fname: Filename of the device tree file to use (.dts or .dtb)
select_images: List of images to output, or None for all
update_fdt: True to update the FDT with entry offsets, etc.
"""
# Import these here in case libfdt.py is not available, in which case
# the above help option still works.
import fdt
import fdt_util
global images
# Get the device tree ready by compiling it and copying the compiled
# output into a file in our output directory. Then scan it for use
# in binman.
dtb_fname = fdt_util.EnsureCompiled(dtb_fname)
fname = tools.GetOutputFilename('u-boot.dtb.out')
tools.WriteFile(fname, tools.ReadFile(dtb_fname))
dtb = fdt.FdtScan(fname)
node = _FindBinmanNode(dtb)
if not node:
raise ValueError("Device tree '%s' does not have a 'binman' "
"node" % dtb_fname)
images = _ReadImageDesc(node)
if select_images:
skip = []
new_images = OrderedDict()
for name, image in images.items():
if name in select_images:
new_images[name] = image
else:
skip.append(name)
images = new_images
tout.Notice('Skipping images: %s' % ', '.join(skip))
state.Prepare(images, dtb)
# Prepare the device tree by making sure that any missing
# properties are added (e.g. 'pos' and 'size'). The values of these
# may not be correct yet, but we add placeholders so that the
# size of the device tree is correct. Later, in
# SetCalculatedProperties() we will insert the correct values
# without changing the device-tree size, thus ensuring that our
# entry offsets remain the same.
for image in images.values():
image.ExpandEntries()
if update_fdt:
image.AddMissingProperties()
image.ProcessFdt(dtb)
for dtb_item in state.GetAllFdts():
dtb_item.Sync(auto_resize=True)
dtb_item.Pack()
dtb_item.Flush()
return images
def ProcessImage(image, update_fdt, write_map, get_contents=True,
allow_resize=True):
"""Perform all steps for this image, including checking and # writing it.
This means that errors found with a later image will be reported after
earlier images are already completed and written, but that does not seem
important.
Args:
image: Image to process
update_fdt: True to update the FDT with entry offsets, etc.
write_map: True to write a map file
get_contents: True to get the image contents from files, etc., False if
the contents is already present
allow_resize: True to allow entries to change size (this does a re-pack
of the entries), False to raise an exception
"""
if get_contents:
image.GetEntryContents()
image.GetEntryOffsets()
# We need to pack the entries to figure out where everything
# should be placed. This sets the offset/size of each entry.
# However, after packing we call ProcessEntryContents() which
# may result in an entry changing size. In that case we need to
# do another pass. Since the device tree often contains the
# final offset/size information we try to make space for this in
# AddMissingProperties() above. However, if the device is
# compressed we cannot know this compressed size in advance,
# since changing an offset from 0x100 to 0x104 (for example) can
# alter the compressed size of the device tree. So we need a
# third pass for this.
passes = 5
for pack_pass in range(passes):
try:
image.PackEntries()
image.CheckSize()
image.CheckEntries()
except Exception as e:
if write_map:
fname = image.WriteMap()
print("Wrote map file '%s' to show errors" % fname)
raise
image.SetImagePos()
if update_fdt:
image.SetCalculatedProperties()
for dtb_item in state.GetAllFdts():
dtb_item.Sync()
dtb_item.Flush()
sizes_ok = image.ProcessEntryContents()
if sizes_ok:
break
image.ResetForPack()
if not sizes_ok:
image.Raise('Entries changed size after packing (tried %s passes)' %
passes)
image.WriteSymbols()
image.BuildImage()
if write_map:
image.WriteMap()
def Binman(args):
"""The main control code for binman
@@ -178,8 +459,6 @@ def Binman(args):
Args:
args: Command line arguments Namespace object
"""
global images
if args.full_help:
pager = os.getenv('PAGER')
if not pager:
@@ -190,7 +469,11 @@ def Binman(args):
return 0
if args.cmd == 'ls':
ListEntries(args.image, args.paths)
try:
tools.PrepareOutputDir(None)
ListEntries(args.image, args.paths)
finally:
tools.FinaliseOutputDir()
return 0
if args.cmd == 'extract':
@@ -202,6 +485,16 @@ def Binman(args):
tools.FinaliseOutputDir()
return 0
if args.cmd == 'replace':
try:
tools.PrepareOutputDir(None)
ReplaceEntries(args.image, args.filename, args.indir, args.paths,
do_compress=not args.compressed,
allow_resize=not args.fix_size, write_map=args.map)
finally:
tools.FinaliseOutputDir()
return 0
# Try to figure out which device tree contains our image description
if args.dt:
dtb_fname = args.dt
@@ -216,11 +509,6 @@ def Binman(args):
args.indir.append(board_pathname)
try:
# Import these here in case libfdt.py is not available, in which case
# the above help option still works.
import fdt
import fdt_util
tout.Init(args.verbosity)
elf.debug = args.debug
cbfs_util.VERBOSE = args.verbosity > 2
@@ -231,103 +519,13 @@ def Binman(args):
tools.SetToolPaths(args.toolpath)
state.SetEntryArgs(args.entry_arg)
# Get the device tree ready by compiling it and copying the compiled
# output into a file in our output directory. Then scan it for use
# in binman.
dtb_fname = fdt_util.EnsureCompiled(dtb_fname)
fname = tools.GetOutputFilename('u-boot.dtb.out')
tools.WriteFile(fname, tools.ReadFile(dtb_fname))
dtb = fdt.FdtScan(fname)
node = _FindBinmanNode(dtb)
if not node:
raise ValueError("Device tree '%s' does not have a 'binman' "
"node" % dtb_fname)
images = _ReadImageDesc(node)
if args.image:
skip = []
new_images = OrderedDict()
for name, image in images.items():
if name in args.image:
new_images[name] = image
else:
skip.append(name)
images = new_images
if skip and args.verbosity >= 2:
print('Skipping images: %s' % ', '.join(skip))
state.Prepare(images, dtb)
# Prepare the device tree by making sure that any missing
# properties are added (e.g. 'pos' and 'size'). The values of these
# may not be correct yet, but we add placeholders so that the
# size of the device tree is correct. Later, in
# SetCalculatedProperties() we will insert the correct values
# without changing the device-tree size, thus ensuring that our
# entry offsets remain the same.
images = PrepareImagesAndDtbs(dtb_fname, args.image,
args.update_fdt)
for image in images.values():
image.ExpandEntries()
if args.update_fdt:
image.AddMissingProperties()
image.ProcessFdt(dtb)
for dtb_item in state.GetFdts():
dtb_item.Sync(auto_resize=True)
dtb_item.Pack()
dtb_item.Flush()
for image in images.values():
# Perform all steps for this image, including checking and
# writing it. This means that errors found with a later
# image will be reported after earlier images are already
# completed and written, but that does not seem important.
image.GetEntryContents()
image.GetEntryOffsets()
# We need to pack the entries to figure out where everything
# should be placed. This sets the offset/size of each entry.
# However, after packing we call ProcessEntryContents() which
# may result in an entry changing size. In that case we need to
# do another pass. Since the device tree often contains the
# final offset/size information we try to make space for this in
# AddMissingProperties() above. However, if the device is
# compressed we cannot know this compressed size in advance,
# since changing an offset from 0x100 to 0x104 (for example) can
# alter the compressed size of the device tree. So we need a
# third pass for this.
passes = 3
for pack_pass in range(passes):
try:
image.PackEntries()
image.CheckSize()
image.CheckEntries()
except Exception as e:
if args.map:
fname = image.WriteMap()
print("Wrote map file '%s' to show errors" % fname)
raise
image.SetImagePos()
if args.update_fdt:
image.SetCalculatedProperties()
for dtb_item in state.GetFdts():
dtb_item.Sync()
sizes_ok = image.ProcessEntryContents()
if sizes_ok:
break
image.ResetForPack()
if not sizes_ok:
image.Raise('Entries expanded after packing (tried %s passes)' %
passes)
image.WriteSymbols()
image.BuildImage()
if args.map:
image.WriteMap()
ProcessImage(image, args.update_fdt, args.map)
# Write the updated FDTs to our output files
for dtb_item in state.GetFdts():
for dtb_item in state.GetAllFdts():
tools.WriteFile(dtb_item._fname, dtb_item.GetContents())
finally:


@@ -17,6 +17,7 @@ import struct
import tempfile
import tools
import tout
ELF_TOOLS = True
try:
@@ -25,9 +26,6 @@ try:
except: # pragma: no cover
ELF_TOOLS = False
# This is enabled from control.py
debug = False
Symbol = namedtuple('Symbol', ['section', 'address', 'size', 'weak'])
# Information about an ELF file:
@@ -143,9 +141,8 @@ def LookupAndWriteSymbols(elf_fname, entry, section):
value = -1
pack_string = pack_string.lower()
value_bytes = struct.pack(pack_string, value)
if debug:
print('%s:\n insert %s, offset %x, value %x, length %d' %
(msg, name, offset, value, len(value_bytes)))
tout.Debug('%s:\n insert %s, offset %x, value %x, length %d' %
(msg, name, offset, value, len(value_bytes)))
entry.data = (entry.data[:offset] + value_bytes +
entry.data[offset + sym.size:])


@@ -14,6 +14,7 @@ import command
import elf
import test_util
import tools
import tout
binman_dir = os.path.dirname(os.path.realpath(sys.argv[0]))
@@ -130,14 +131,16 @@ class TestElf(unittest.TestCase):
def testDebug(self):
"""Check that enabling debug in the elf module produced debug output"""
elf.debug = True
entry = FakeEntry(20)
section = FakeSection()
elf_fname = os.path.join(binman_dir, 'test', 'u_boot_binman_syms')
with test_util.capture_sys_output() as (stdout, stderr):
syms = elf.LookupAndWriteSymbols(elf_fname, entry, section)
elf.debug = False
self.assertTrue(len(stdout.getvalue()) > 0)
try:
tout.Init(tout.DEBUG)
entry = FakeEntry(20)
section = FakeSection()
elf_fname = os.path.join(binman_dir, 'test', 'u_boot_binman_syms')
with test_util.capture_sys_output() as (stdout, stderr):
syms = elf.LookupAndWriteSymbols(elf_fname, entry, section)
self.assertTrue(len(stdout.getvalue()) > 0)
finally:
tout.Init(tout.WARNING)
def testMakeElf(self):
"""Test for the MakeElf function"""


@ -23,6 +23,7 @@ import sys
import fdt_util
import state
import tools
from tools import ToHex, ToHexSize
import tout
modules = {}
@@ -69,7 +70,7 @@ class Entry(object):
orig_offset: Original offset value read from node
orig_size: Original size value read from node
"""
def __init__(self, section, etype, node, read_node=True, name_prefix=''):
def __init__(self, section, etype, node, name_prefix=''):
self.section = section
self.etype = etype
self._node = node
@@ -88,8 +89,6 @@ class Entry(object):
self.image_pos = None
self._expand_size = False
self.compress = 'none'
if read_node:
self.ReadNode()
@staticmethod
def Lookup(node_path, etype):
@@ -154,14 +153,19 @@ class Entry(object):
def ReadNode(self):
"""Read entry information from the node
This must be called as the first thing after the Entry is created.
This reads all the fields we recognise from the node, ready for use.
"""
if 'pos' in self._node.props:
self.Raise("Please use 'offset' instead of 'pos'")
self.offset = fdt_util.GetInt(self._node, 'offset')
self.size = fdt_util.GetInt(self._node, 'size')
self.orig_offset = self.offset
self.orig_size = self.size
self.orig_offset = fdt_util.GetInt(self._node, 'orig-offset')
self.orig_size = fdt_util.GetInt(self._node, 'orig-size')
if self.GetImage().copy_to_orig:
self.orig_offset = self.offset
self.orig_size = self.size
# These should not be set in input files, but are set in an FDT map,
# which is also read by this code.
@@ -185,19 +189,18 @@ class Entry(object):
def GetDefaultFilename(self):
return None
def GetFdtSet(self):
"""Get the set of device trees used by this entry
def GetFdts(self):
"""Get the device trees used by this entry
Returns:
Set containing the filename from this entry, if it is a .dtb, else
an empty set
Empty dict, if this entry is not a .dtb, otherwise:
Dict:
key: Filename from this entry (without the path)
value: Tuple:
Fdt object for this dtb, or None if not available
Filename of file containing this dtb
"""
fname = self.GetDefaultFilename()
# It would be better to use isinstance(self, Entry_blob_dtb) here but
# we cannot access Entry_blob_dtb
if fname and fname.endswith('.dtb'):
return set([fname])
return set()
return {}
def ExpandEntries(self):
pass
@@ -207,6 +210,12 @@ class Entry(object):
for prop in ['offset', 'size', 'image-pos']:
if not prop in self._node.props:
state.AddZeroProp(self._node, prop)
if self.GetImage().allow_repack:
if self.orig_offset is not None:
state.AddZeroProp(self._node, 'orig-offset', True)
if self.orig_size is not None:
state.AddZeroProp(self._node, 'orig-size', True)
if self.compress != 'none':
state.AddZeroProp(self._node, 'uncomp-size')
err = state.CheckAddHashProp(self._node)
@@ -219,6 +228,11 @@ class Entry(object):
state.SetInt(self._node, 'size', self.size)
base = self.section.GetRootSkipAtStart() if self.section else 0
state.SetInt(self._node, 'image-pos', self.image_pos - base)
if self.GetImage().allow_repack:
if self.orig_offset is not None:
state.SetInt(self._node, 'orig-offset', self.orig_offset, True)
if self.orig_size is not None:
state.SetInt(self._node, 'orig-size', self.orig_size, True)
if self.uncomp_size is not None:
state.SetInt(self._node, 'uncomp-size', self.uncomp_size)
state.CheckSetHashValue(self._node, self.GetData)
@@ -271,15 +285,26 @@ class Entry(object):
"""
size_ok = True
new_size = len(data)
if state.AllowEntryExpansion():
if state.AllowEntryExpansion() and new_size > self.contents_size:
# self.data will indicate the new size needed
size_ok = False
elif state.AllowEntryContraction() and new_size < self.contents_size:
size_ok = False
# If not allowed to change, try to deal with it or give up
if size_ok:
if new_size > self.contents_size:
tout.Debug("Entry '%s' size change from %#x to %#x" % (
self._node.path, self.contents_size, new_size))
# self.data will indicate the new size needed
size_ok = False
elif new_size != self.contents_size:
self.Raise('Cannot update entry size from %d to %d' %
(self.contents_size, new_size))
self.Raise('Cannot update entry size from %d to %d' %
(self.contents_size, new_size))
# Don't let the data shrink. Pad it if necessary
if size_ok and new_size < self.contents_size:
data += tools.GetBytes(0, self.contents_size - new_size)
if not size_ok:
tout.Debug("Entry '%s' size change from %s to %s" % (
self._node.path, ToHex(self.contents_size),
ToHex(new_size)))
self.SetContents(data)
return size_ok
@ -295,6 +320,9 @@ class Entry(object):
def ResetForPack(self):
"""Reset offset/size fields so that packing can be done again"""
self.Detail('ResetForPack: offset %s->%s, size %s->%s' %
(ToHex(self.offset), ToHex(self.orig_offset),
ToHex(self.size), ToHex(self.orig_size)))
self.offset = self.orig_offset
self.size = self.orig_size
@ -316,6 +344,9 @@ class Entry(object):
Returns:
New section offset pointer (after this entry)
"""
self.Detail('Packing: offset=%s, size=%s, content_size=%x' %
(ToHex(self.offset), ToHex(self.size),
self.contents_size))
if self.offset is None:
if self.offset_unset:
self.Raise('No offset set with offset-unset: should another '
@ -347,6 +378,8 @@ class Entry(object):
if self.offset != tools.Align(self.offset, self.align):
self.Raise("Offset %#x (%d) does not match align %#x (%d)" %
(self.offset, self.offset, self.align, self.align))
self.Detail(' - packed: offset=%#x, size=%#x, content_size=%#x, next_offset=%x' %
(self.offset, self.size, self.contents_size, new_offset))
return new_offset
@ -354,6 +387,11 @@ class Entry(object):
"""Convenience function to raise an error referencing a node"""
raise ValueError("Node '%s': %s" % (self._node.path, msg))
def Detail(self, msg):
"""Convenience function to log detail referencing a node"""
tag = "Node '%s'" % self._node.path
tout.Detail('%30s: %s' % (tag, msg))
def GetEntryArgsOrProps(self, props, required=False):
"""Return the values of a set of properties
@ -390,6 +428,7 @@ class Entry(object):
return self._node.path
def GetData(self):
self.Detail('GetData: size %s' % ToHexSize(self.data))
return self.data
def GetOffsets(self):
@ -675,8 +714,75 @@ features to produce new behaviours.
"""
# Use True here so that we get an uncompressed section to work from,
# although compressed sections are currently not supported
data = self.section.ReadChildData(self, decomp)
return data
def LoadData(self, decomp=True):
data = self.ReadData(decomp)
self.contents_size = len(data)
self.ProcessContentsUpdate(data)
self.Detail('Loaded data size %x' % len(data))
def GetImage(self):
"""Get the image containing this entry
Returns:
Image object containing this entry
"""
return self.section.GetImage()
def WriteData(self, data, decomp=True):
"""Write the data to an entry in the image
This is used when the image has been read in and we want to replace the
data for a particular entry in that image.
The image must be re-packed and written out afterwards.
Args:
data: Data to replace it with
decomp: True to compress the data if needed, False if data is
already compressed so should be used as is
Returns:
True if the data did not result in a resize of this entry, False if
the entry must be resized
"""
self.contents_size = self.size
ok = self.ProcessContentsUpdate(data)
self.Detail('WriteData: size=%x, ok=%s' % (len(data), ok))
section_ok = self.section.WriteChildData(self)
return ok and section_ok
def WriteChildData(self, child):
"""Handle writing the data in a child entry
This should be called on the child's parent section after the child's
data has been updated.
This base-class implementation does nothing, since the base Entry object
does not have any children.
Args:
child: Child Entry that was written
Returns:
True if the section could be updated successfully, False if the
data is such that the section could not be updated
"""
return True
def GetSiblingOrder(self):
"""Get the relative order of an entry amoung its siblings
Returns:
'start' if this entry is first among siblings, 'end' if last,
otherwise None
"""
entries = list(self.section.GetEntries().values())
if entries:
if self == entries[0]:
return 'start'
elif self == entries[-1]:
return 'end'
return 'middle'
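The size-handling rules in ProcessContentsUpdate above can be hard to follow from the diff alone. Here is a minimal standalone sketch of the same decision logic; the function and parameter names are illustrative, not binman's API:

```python
def process_contents_update(contents_size, new_size, allow_expand, allow_contract):
    """Return (size_ok, pad_bytes) for an in-place entry update.

    size_ok is False when the entry must be repacked at a new size;
    pad_bytes is how much zero-padding keeps the entry at its old size.
    """
    size_ok = True
    if allow_expand and new_size > contents_size:
        size_ok = False   # repacking will grow the entry
    elif allow_contract and new_size < contents_size:
        size_ok = False   # repacking may shrink the entry
    if size_ok:
        if new_size > contents_size:
            # Growth is not allowed and cannot be absorbed in place
            raise ValueError('Cannot update entry size from %d to %d' %
                             (contents_size, new_size))
        # Don't let the data shrink: pad back to the original size instead
        if new_size < contents_size:
            return True, contents_size - new_size
    return size_ok, 0
```

With expansion/contraction disallowed, smaller data is padded in place; with them allowed, any size change forces a repack.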


@ -57,7 +57,7 @@ class TestEntry(unittest.TestCase):
def testEntryContents(self):
"""Test the Entry bass class"""
import entry
base_entry = entry.Entry(None, None, None)
self.assertEqual(True, base_entry.ObtainContents())
def testUnknownEntry(self):
@ -73,16 +73,30 @@ class TestEntry(unittest.TestCase):
"""Test Entry.GetUniqueName"""
Node = collections.namedtuple('Node', ['name', 'parent'])
base_node = Node('root', None)
base_entry = entry.Entry(None, None, base_node)
self.assertEqual('root', base_entry.GetUniqueName())
sub_node = Node('subnode', base_node)
sub_entry = entry.Entry(None, None, sub_node)
self.assertEqual('root.subnode', sub_entry.GetUniqueName())
def testGetDefaultFilename(self):
"""Trivial test for this base class function"""
base_entry = entry.Entry(None, None, None)
self.assertIsNone(base_entry.GetDefaultFilename())
def testBlobFdt(self):
"""Test the GetFdtEtype() method of the blob-dtb entries"""
base = entry.Entry.Create(None, self.GetNode(), 'blob-dtb')
self.assertIsNone(base.GetFdtEtype())
dtb = entry.Entry.Create(None, self.GetNode(), 'u-boot-dtb')
self.assertEqual('u-boot-dtb', dtb.GetFdtEtype())
def testWriteChildData(self):
"""Test the WriteChildData() method of the base class"""
base = entry.Entry.Create(None, self.GetNode(), 'blob-dtb')
self.assertTrue(base.WriteChildData(base))
if __name__ == "__main__":
unittest.main()


@ -31,8 +31,8 @@ class Entry__testing(Entry):
return-invalid-entry: Return an invalid entry from GetOffsets()
return-unknown-contents: Refuse to provide any contents (to cause a
failure)
bad-update-contents: Return a larger size in ProcessContents
bad-shrink-contents: Return a smaller size in ProcessContents
never-complete-process-fdt: Refuse to process the FDT (to cause a
failure)
require-args: Require that all used args are present (generating an
@ -42,12 +42,17 @@ class Entry__testing(Entry):
"""
def __init__(self, section, etype, node):
Entry.__init__(self, section, etype, node)
def ReadNode(self):
Entry.ReadNode(self)
self.return_invalid_entry = fdt_util.GetBool(self._node,
'return-invalid-entry')
self.return_unknown_contents = fdt_util.GetBool(self._node,
'return-unknown-contents')
self.bad_update_contents = fdt_util.GetBool(self._node,
'bad-update-contents')
self.bad_shrink_contents = fdt_util.GetBool(self._node,
'bad-shrink-contents')
self.return_contents_once = fdt_util.GetBool(self._node,
'return-contents-once')
self.bad_update_contents_twice = fdt_util.GetBool(self._node,
@ -73,7 +78,7 @@ class Entry__testing(Entry):
if self.force_bad_datatype:
self.GetEntryArgsOrProps([EntryArg('test-bad-datatype-arg', bool)])
self.return_contents = True
self.contents = b'aa'
def ObtainContents(self):
if self.return_unknown_contents or not self.return_contents:
@ -90,14 +95,25 @@ class Entry__testing(Entry):
return {}
def ProcessContents(self):
data = self.contents
if self.bad_update_contents:
# Request to update the contents with something larger, to cause a
# failure.
if self.bad_update_contents_twice:
data = self.data + b'a'
else:
data = b'aaa'
return self.ProcessContentsUpdate(data)
if self.bad_shrink_contents:
# Request to update the contents with something smaller, to cause a
# failure.
data = b'a'
return self.ProcessContentsUpdate(data)
return True
def ProcessFdt(self, fdt):


@ -67,15 +67,3 @@ class Entry_blob(Entry):
def GetDefaultFilename(self):
return self._filename
def ReadData(self, decomp=True):
indata = Entry.ReadData(self, decomp)
if decomp:
data = tools.Decompress(indata, self.compress)
if self.uncomp_size:
tout.Info("%s: Decompressing data size %#x with algo '%s' to data size %#x" %
(self.GetPath(), len(indata), self.compress,
len(data)))
else:
data = indata
return data


@ -23,11 +23,42 @@ class Entry_blob_dtb(Entry_blob):
def ObtainContents(self):
"""Get the device-tree from the list held by the 'state' module"""
self._filename = self.GetDefaultFilename()
self._pathname, _ = state.GetFdtContents(self.GetFdtEtype())
return Entry_blob.ReadBlobContents(self)
def ProcessContents(self):
"""Re-read the DTB contents so that we get any calculated properties"""
_, indata = state.GetFdtContents(self.GetFdtEtype())
data = self.CompressData(indata)
return self.ProcessContentsUpdate(data)
def GetFdtEtype(self):
"""Get the entry type of this device tree
This can be 'u-boot-dtb', 'u-boot-spl-dtb' or 'u-boot-tpl-dtb'
Returns:
Entry type if any, e.g. 'u-boot-dtb'
"""
return None
def GetFdts(self):
"""Get the device trees used by this entry
Returns:
Dict:
key: Entry type of this device tree (e.g. 'u-boot-dtb')
value: List containing this entry object and the filename of
the file containing this dtb
"""
fname = self.GetDefaultFilename()
return {self.GetFdtEtype(): [self, fname]}
def WriteData(self, data, decomp=True):
ok = Entry_blob.WriteData(self, data, decomp)
# Update the state module, since it has the authoritative record of the
# device trees used. If we don't do this, then state.GetFdtContents()
# will still return the old contents
state.UpdateFdtContents(self.GetFdtEtype(), data)
return ok


@ -168,8 +168,9 @@ class Entry_cbfs(Entry):
self._cbfs_arg = fdt_util.GetString(node, 'cbfs-arch', 'x86')
self._cbfs_entries = OrderedDict()
self._ReadSubnodes()
self.reader = None
def ObtainContents(self, skip=None):
arch = cbfs_util.find_arch(self._cbfs_arg)
if arch is None:
self.Raise("Invalid architecture '%s'" % self._cbfs_arg)
@ -179,7 +180,7 @@ class Entry_cbfs(Entry):
for entry in self._cbfs_entries.values():
# First get the input data and put it in a file. If not available,
# try later.
if entry != skip and not entry.ObtainContents():
return False
data = entry.GetData()
cfile = None
@ -202,7 +203,8 @@ class Entry_cbfs(Entry):
def _ReadSubnodes(self):
"""Read the subnodes to find out what should go in this IFWI"""
for node in self._node.subnodes:
entry = Entry.Create(self, node)
entry.ReadNode()
entry._cbfs_name = fdt_util.GetString(node, 'cbfs-name', entry.name)
entry._type = fdt_util.GetString(node, 'cbfs-type')
compress = fdt_util.GetString(node, 'cbfs-compress', 'none')
@ -261,3 +263,19 @@ class Entry_cbfs(Entry):
def GetEntries(self):
return self._cbfs_entries
def ReadData(self, decomp=True):
data = Entry.ReadData(self, True)
return data
def ReadChildData(self, child, decomp=True):
if not self.reader:
data = Entry.ReadData(self, True)
self.reader = cbfs_util.CbfsReader(data)
reader = self.reader
cfile = reader.files.get(child.name)
return cfile.data if decomp else cfile.orig_data
def WriteChildData(self, child):
self.ObtainContents(skip=child)
return True
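Entry_cbfs.ReadChildData above parses the whole CBFS once and caches the reader for later child lookups. The caching pattern in isolation looks like this (a generic sketch; LazyReader is not a binman class, and the callables are illustrative):

```python
class LazyReader:
    """Parse a container at most once, then serve per-name lookups."""

    def __init__(self, read_fn, parse_fn):
        self._read_fn = read_fn    # returns the raw container data
        self._parse_fn = parse_fn  # turns raw data into a lookup table
        self._reader = None

    def get(self, name):
        if self._reader is None:
            # First access: read and parse the container, then cache it
            self._reader = self._parse_fn(self._read_fn())
        return self._reader.get(name)
```

However many children are queried, the underlying container is read and parsed only once.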


@ -14,6 +14,7 @@ from entry import Entry
from fdt import Fdt
import state
import tools
import tout
FDTMAP_MAGIC = b'_FDTMAP_'
FDTMAP_HDR_LEN = 16
@ -59,6 +60,7 @@ class Entry_fdtmap(Entry):
Example output for a simple image with U-Boot and an FDT map:
/ {
image-name = "binman";
size = <0x00000112>;
image-pos = <0x00000000>;
offset = <0x00000000>;
@ -73,6 +75,9 @@ class Entry_fdtmap(Entry):
offset = <0x00000004>;
};
};
If allow-repack is used then 'orig-offset' and 'orig-size' properties are
added as necessary. See the binman README.
"""
def __init__(self, section, etype, node):
Entry.__init__(self, section, etype, node)
@ -91,29 +96,38 @@ class Entry_fdtmap(Entry):
with fsw.add_node(subnode.name):
_AddNode(subnode)
data = state.GetFdtContents('fdtmap')[1]
# If we have an fdtmap it means that we are using this as the
# fdtmap for this image.
if data is None:
# Get the FDT data into an Fdt object
data = state.GetFdtContents()[1]
infdt = Fdt.FromData(data)
infdt.Scan()
# Find the node for the image containing the Fdt-map entry
path = self.section.GetPath()
self.Detail("Fdtmap: Using section '%s' (path '%s')" %
(self.section.name, path))
node = infdt.GetNode(path)
if not node:
self.Raise("Internal error: Cannot locate node for path '%s'" %
path)
# Build a new tree with all nodes and properties starting from that
# node
fsw = libfdt.FdtSw()
fsw.finish_reservemap()
with fsw.add_node(''):
fsw.property_string('image-node', node.name)
_AddNode(node)
fdt = fsw.as_fdt()
# Pack this new FDT and return its contents
fdt.pack()
outfdt = Fdt.FromData(fdt.as_bytearray())
data = outfdt.GetContents()
data = FDTMAP_MAGIC + tools.GetBytes(0, 8) + data
return data
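The fdtmap layout produced above is the 8-byte FDTMAP_MAGIC string padded out to a 16-byte header, followed by the packed FDT. A sketch of how a reader might locate it in an image (find_fdtmap is illustrative, not binman's function; the constants come from the code above):

```python
FDTMAP_MAGIC = b'_FDTMAP_'
FDTMAP_HDR_LEN = 16

def find_fdtmap(image_data):
    """Return the FDT bytes following the fdtmap header, or None."""
    pos = image_data.find(FDTMAP_MAGIC)
    if pos == -1:
        return None
    # Skip the 16-byte header (magic plus padding) to reach the FDT
    return image_data[pos + FDTMAP_HDR_LEN:]
```

This is what lets binman re-open a finished image without the original description: the map travels inside the image itself.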
def ObtainContents(self):


@ -23,6 +23,9 @@ class Entry_fill(Entry):
"""
def __init__(self, section, etype, node):
Entry.__init__(self, section, etype, node)
def ReadNode(self):
Entry.ReadNode(self)
if self.size is None:
self.Raise("'fill' entry must have a size property")
self.fill_value = fdt_util.GetByte(self._node, 'fill-byte', 0)


@ -8,6 +8,8 @@
from entry import Entry
import fmap_util
import tools
from tools import ToHexSize
import tout
class Entry_fmap(Entry):
@ -26,7 +28,8 @@ class Entry_fmap(Entry):
When used, this entry will be populated with an FMAP which reflects the
entries in the current image. Note that any hierarchy is squashed, since
FMAP does not support this. Also, CBFS entries appear as a single entry -
the sub-entries are ignored.
"""
def __init__(self, section, etype, node):
Entry.__init__(self, section, etype, node)
@ -39,7 +42,9 @@ class Entry_fmap(Entry):
"""
def _AddEntries(areas, entry):
entries = entry.GetEntries()
tout.Debug("fmap: Add entry '%s' type '%s' (%s subentries)" %
(entry.GetPath(), entry.etype, ToHexSize(entries)))
if entries and entry.etype != 'cbfs':
for subentry in entries.values():
_AddEntries(areas, subentry)
else:
@ -49,7 +54,7 @@ class Entry_fmap(Entry):
areas.append(fmap_util.FmapArea(pos or 0, entry.size or 0,
tools.FromUnicode(entry.name), 0))
entries = self.GetImage().GetEntries()
areas = []
for entry in entries.values():
_AddEntries(areas, entry)


@ -86,8 +86,20 @@ class Entry_image_header(Entry):
if self.location not in ['start', 'end']:
self.Raise("Invalid location '%s', expected 'start' or 'end'" %
self.location)
order = self.GetSiblingOrder()
if self.location != order and not self.section.GetSort():
self.Raise("Invalid sibling order '%s' for image-header: Must be at '%s' to match location" %
(order, self.location))
if self.location != 'end':
offset = 0
else:
image_size = self.section.GetImageSize()
if image_size is not None:
offset = image_size - IMAGE_HEADER_LEN
# else: the image size is not known yet, but this must be the last
# entry, so the offset passed in is used as-is
return Entry.Pack(self, offset)
def ProcessContents(self):


@ -47,8 +47,12 @@ class Entry_intel_descriptor(Entry_blob):
def __init__(self, section, etype, node):
Entry_blob.__init__(self, section, etype, node)
self._regions = []
def Pack(self, offset):
"""Put this entry at the start of the image"""
if self.offset is None:
offset = self.section.GetStartOffset()
return Entry_blob.Pack(self, offset)
def GetOffsets(self):
offset = self.data.find(FD_SIGNATURE)


@ -94,6 +94,7 @@ class Entry_intel_ifwi(Entry_blob):
"""Read the subnodes to find out what should go in this IFWI"""
for node in self._node.subnodes:
entry = Entry.Create(self.section, node)
entry.ReadNode()
entry._ifwi_replace = fdt_util.GetBool(node, 'replace')
entry._ifwi_subpart = fdt_util.GetString(node, 'ifwi-subpart')
entry._ifwi_entry_name = fdt_util.GetString(node, 'ifwi-entry')


@ -17,6 +17,7 @@ import sys
from entry import Entry
import fdt_util
import tools
import tout
class Entry_section(Entry):
@ -45,29 +46,15 @@ class Entry_section(Entry):
def __init__(self, section, etype, node, test=False):
if not test:
Entry.__init__(self, section, etype, node)
self._entries = OrderedDict()
self._pad_byte = 0
self._sort = False
self._skip_at_start = None
self._end_4gb = False
def ReadNode(self):
"""Read properties from the image node"""
Entry.ReadNode(self)
self._pad_byte = fdt_util.GetInt(self._node, 'pad-byte', 0)
self._sort = fdt_util.GetBool(self._node, 'sort-by-offset')
self._end_4gb = fdt_util.GetBool(self._node, 'end-at-4gb')
@ -87,19 +74,32 @@ class Entry_section(Entry):
if filename:
self._filename = filename
self._ReadEntries()
def _ReadEntries(self):
for node in self._node.subnodes:
if node.name == 'hash':
continue
entry = Entry.Create(self, node)
entry.ReadNode()
entry.SetPrefix(self._name_prefix)
self._entries[node.name] = entry
def _Raise(self, msg):
"""Raises an error for this section
Args:
msg: Error message to use in the raise string
Raises:
ValueError()
"""
raise ValueError("Section '%s': %s" % (self._node.path, msg))
def GetFdts(self):
fdts = {}
for entry in self._entries.values():
fdts.update(entry.GetFdts())
return fdts
def ProcessFdt(self, fdt):
"""Allow entries to adjust the device tree
@ -149,6 +149,8 @@ class Entry_section(Entry):
base = self.pad_before + entry.offset - self._skip_at_start
section_data = (section_data[:base] + data +
section_data[base + len(data):])
self.Detail('GetData: %d entries, total size %#x' %
(len(self._entries), len(section_data)))
return section_data
def GetOffsets(self):
@ -371,7 +373,7 @@ class Entry_section(Entry):
Image size as an integer number of bytes, which may be None if the
image size is dynamic and its sections have not yet been packed
"""
return self.GetImage().size
def FindEntryType(self, etype):
"""Find an entry type in the section
@ -460,3 +462,64 @@ class Entry_section(Entry):
self.image_pos, None, self.offset, self)
for entry in self._entries.values():
entry.ListEntries(entries, indent + 1)
def LoadData(self, decomp=True):
for entry in self._entries.values():
entry.LoadData(decomp)
self.Detail('Loaded data')
def GetImage(self):
"""Get the image containing this section
Note that a top-level section is actually an Image, so this function may
return self.
Returns:
Image object containing this section
"""
if not self.section:
return self
return self.section.GetImage()
def GetSort(self):
"""Check if the entries in this section will be sorted
Returns:
True if to be sorted, False if entries will be left in the order
they appear in the device tree
"""
return self._sort
def ReadData(self, decomp=True):
tout.Info("ReadData path='%s'" % self.GetPath())
parent_data = self.section.ReadData(True)
tout.Info('%s: Reading data from offset %#x-%#x, size %#x' %
(self.GetPath(), self.offset, self.offset + self.size,
self.size))
data = parent_data[self.offset:self.offset + self.size]
return data
def ReadChildData(self, child, decomp=True):
"""Read the data for a particular child entry
Args:
child: Child entry to read data for
decomp: True to return uncompressed data, False to leave the data
compressed if it is compressed
Returns:
Data contents of entry
"""
parent_data = self.ReadData(True)
data = parent_data[child.offset:child.offset + child.size]
if decomp:
indata = data
data = tools.Decompress(indata, child.compress)
if child.uncomp_size:
tout.Info("%s: Decompressing data size %#x with algo '%s' to data size %#x" %
(child.GetPath(), len(indata), child.compress,
len(data)))
return data
def WriteChildData(self, child):
return True
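Entry_section.ReadChildData above slices the child's window out of the parent's data and decompresses it on request. The same shape in a self-contained sketch (zlib stands in here purely for illustration; binman actually uses lz4/lzma via tools.Decompress, and the function name is assumed):

```python
import zlib

def read_child_data(parent_data, offset, size, compress='none', decomp=True):
    """Slice a child entry's bytes out of its parent, optionally decompressing."""
    # The child occupies a contiguous window within its parent's data
    data = parent_data[offset:offset + size]
    if decomp and compress != 'none':
        data = zlib.decompress(data)  # placeholder for tools.Decompress
    return data
```

Passing decomp=False returns the raw (still-compressed) window, matching the behaviour extract/replace rely on when they need the bytes exactly as stored.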


@ -26,3 +26,6 @@ class Entry_u_boot_dtb(Entry_blob_dtb):
def GetDefaultFilename(self):
return 'u-boot.dtb'
def GetFdtEtype(self):
return 'u-boot-dtb'


@ -36,6 +36,9 @@ class Entry_u_boot_dtb_with_ucode(Entry_blob_dtb):
def GetDefaultFilename(self):
return 'u-boot.dtb'
def GetFdtEtype(self):
return 'u-boot-dtb'
def ProcessFdt(self, fdt):
# So the module can be loaded without it
import fdt
@ -53,11 +56,11 @@ class Entry_u_boot_dtb_with_ucode(Entry_blob_dtb):
return True
# Remove the microcode
etype = self.GetFdtEtype()
fdt = state.GetFdtForEtype(etype)
self.ucode = fdt.GetNode('/microcode')
if not self.ucode:
raise self.Raise("No /microcode node found in '%s'" % fname)
raise self.Raise("No /microcode node found in '%s'" % etype)
# There's no need to collate it (move all microcode into one place)
# if we only have one chunk of microcode.

View file

@ -23,3 +23,6 @@ class Entry_u_boot_spl_dtb(Entry_blob_dtb):
def GetDefaultFilename(self):
return 'spl/u-boot-spl.dtb'
def GetFdtEtype(self):
return 'u-boot-spl-dtb'


@ -23,3 +23,6 @@ class Entry_u_boot_tpl_dtb(Entry_blob_dtb):
def GetDefaultFilename(self):
return 'tpl/u-boot-tpl.dtb'
def GetFdtEtype(self):
return 'u-boot-tpl-dtb'


@ -23,3 +23,6 @@ class Entry_u_boot_tpl_dtb_with_ucode(Entry_u_boot_dtb_with_ucode):
def GetDefaultFilename(self):
return 'tpl/u-boot-tpl.dtb'
def GetFdtEtype(self):
return 'u-boot-tpl-dtb'


@ -69,8 +69,15 @@ FILES_DATA = (b"sorry I'm late\nOh, don't bother apologising, I'm " +
COMPRESS_DATA = b'compress xxxxxxxxxxxxxxxxxxxxxx data'
REFCODE_DATA = b'refcode'
# The expected size for the device tree in some tests
EXTRACT_DTB_SIZE = 0x3c9
# Properties expected to be in the device tree when update_dtb is used
BASE_DTB_PROPS = ['offset', 'size', 'image-pos']
# Extra properties expected to be in the device tree when allow-repack is used
REPACK_DTB_PROPS = ['orig-offset', 'orig-size']
class TestFunctional(unittest.TestCase):
"""Functional tests for binman
@ -180,6 +187,13 @@ class TestFunctional(unittest.TestCase):
if not self.have_lz4:
self.skipTest('lz4 --no-frame-crc not available')
def _CleanupOutputDir(self):
"""Remove the temporary output directory"""
if self.preserve_outdirs:
print('Preserving output dir: %s' % tools.outdir)
else:
tools._FinaliseForTest()
def setUp(self):
# Enable this to turn on debugging output
# tout.Init(tout.DEBUG)
@ -187,10 +201,29 @@ class TestFunctional(unittest.TestCase):
def tearDown(self):
"""Remove the temporary output directory"""
self._CleanupOutputDir()
def _SetupImageInTmpdir(self):
"""Set up the output image in a new temporary directory
This is used when an image has been generated in the output directory,
but we want to run binman again. This will create a new output
directory and fail to delete the original one.
This creates a new temporary directory, copies the image to it (with a
new name) and removes the old output directory.
Returns:
Tuple:
Temporary directory to use
New image filename
"""
image_fname = tools.GetOutputFilename('image.bin')
tmpdir = tempfile.mkdtemp(prefix='binman.')
updated_fname = os.path.join(tmpdir, 'image-updated.bin')
tools.WriteFile(updated_fname, tools.ReadFile(image_fname))
self._CleanupOutputDir()
return tmpdir, updated_fname
@classmethod
def _ResetDtbs(self):
@ -283,12 +316,12 @@ class TestFunctional(unittest.TestCase):
Returns:
Contents of device-tree binary
"""
tmpdir = tempfile.mkdtemp(prefix='binmant.')
dtb = fdt_util.EnsureCompiled(self.TestFile(fname), tmpdir)
with open(dtb, 'rb') as fd:
data = fd.read()
TestFunctional._MakeInputFile(outfile, data)
shutil.rmtree(tmpdir)
return data
def _GetDtbContentsForSplTpl(self, dtb_data, name):
@ -1229,7 +1262,7 @@ class TestFunctional(unittest.TestCase):
state.SetAllowEntryExpansion(False)
with self.assertRaises(ValueError) as e:
self._DoReadFile('059_change_size.dts', True)
self.assertIn("Node '/binman/_testing': Cannot update entry size from 1 to 2",
self.assertIn("Node '/binman/_testing': Cannot update entry size from 2 to 3",
str(e.exception))
finally:
state.SetAllowEntryExpansion(True)
@ -1240,12 +1273,12 @@ class TestFunctional(unittest.TestCase):
update_dtb=True)
dtb = fdt.Fdt(out_dtb_fname)
dtb.Scan()
props = self._GetPropTree(dtb, BASE_DTB_PROPS + REPACK_DTB_PROPS)
self.assertEqual({
'image-pos': 0,
'offset': 0,
'_testing:offset': 32,
'_testing:size': 2,
'_testing:image-pos': 32,
'section@0/u-boot:offset': 0,
'section@0/u-boot:size': len(U_BOOT_DATA),
@ -1552,6 +1585,7 @@ class TestFunctional(unittest.TestCase):
self.assertFalse(os.path.exists(tools.GetOutputFilename('image1.bin')))
self.assertTrue(os.path.exists(tools.GetOutputFilename('image2.bin')))
self._CleanupOutputDir()
def testUpdateFdtAll(self):
"""Test that all device trees are updated with offset/size info"""
@ -1583,8 +1617,8 @@ class TestFunctional(unittest.TestCase):
for item in ['', 'spl', 'tpl']:
dtb = fdt.Fdt.FromData(data[start:])
dtb.Scan()
props = self._GetPropTree(dtb, BASE_DTB_PROPS + REPACK_DTB_PROPS +
['spl', 'tpl'])
expected = dict(base_expected)
if item:
expected[item] = 0
@ -2052,8 +2086,7 @@ class TestFunctional(unittest.TestCase):
fdt_data = fdtmap_data[16:]
dtb = fdt.Fdt.FromData(fdt_data)
dtb.Scan()
props = self._GetPropTree(dtb, BASE_DTB_PROPS, prefix='/')
self.assertEqual({
'image-pos': 0,
'offset': 0,
@ -2129,23 +2162,23 @@ class TestFunctional(unittest.TestCase):
def testEntryExpand(self):
"""Test expanding an entry after it is packed"""
data = self._DoReadFile('121_entry_expand.dts')
self.assertEqual(b'aaa', data[:3])
self.assertEqual(U_BOOT_DATA, data[3:3 + len(U_BOOT_DATA)])
self.assertEqual(b'aaa', data[-3:])
def testEntryExpandBad(self):
"""Test expanding an entry after it is packed, twice"""
with self.assertRaises(ValueError) as e:
self._DoReadFile('122_entry_expand_twice.dts')
self.assertIn("Image '/binman': Entries expanded after packing",
self.assertIn("Image '/binman': Entries changed size after packing",
str(e.exception))
def testEntryExpandSection(self):
"""Test expanding an entry within a section after it is packed"""
data = self._DoReadFile('123_entry_expand_section.dts')
self.assertEqual(b'aaa', data[:3])
self.assertEqual(U_BOOT_DATA, data[3:3 + len(U_BOOT_DATA)])
self.assertEqual(b'aaa', data[-3:])
def testCompressDtb(self):
"""Test that compress of device-tree files is supported"""
@ -2172,8 +2205,7 @@ class TestFunctional(unittest.TestCase):
update_dtb=True)
dtb = fdt.Fdt(out_dtb_fname)
dtb.Scan()
props = self._GetPropTree(dtb, BASE_DTB_PROPS + ['uncomp-size'])
del props['cbfs/u-boot:size']
self.assertEqual({
'offset': 0,
@ -2333,7 +2365,7 @@ class TestFunctional(unittest.TestCase):
image_fname = tools.GetOutputFilename('image.bin')
image = Image.FromFile(image_fname)
self.assertTrue(isinstance(image, Image))
self.assertEqual('image', image.image_name[-5:])
def testReadImageFail(self):
"""Test failing to read an image image's FDT map"""
@ -2355,9 +2387,12 @@ class TestFunctional(unittest.TestCase):
fdt_size = entries['section'].GetEntries()['u-boot-dtb'].size
fdtmap_offset = entries['fdtmap'].offset
try:
tmpdir, updated_fname = self._SetupImageInTmpdir()
with test_util.capture_sys_output() as (stdout, stderr):
self._DoBinman('ls', '-i', updated_fname)
finally:
shutil.rmtree(tmpdir)
lines = stdout.getvalue().splitlines()
expected = [
'Name Image-pos Size Entry-type Offset Uncomp-size',
@ -2369,7 +2404,7 @@ class TestFunctional(unittest.TestCase):
' u-boot 138 4 u-boot 38',
' u-boot-dtb 180 10f u-boot-dtb 80 3c9',
' u-boot-dtb 500 %x u-boot-dtb 400 3c9' % fdt_size,
' fdtmap %x 3b4 fdtmap %x' %
(fdtmap_offset, fdtmap_offset),
' image-header bf8 8 image-header bf8',
]
@ -2378,9 +2413,12 @@ class TestFunctional(unittest.TestCase):
def testListCmdFail(self):
"""Test failing to list an image"""
self._DoReadFile('005_simple.dts')
try:
tmpdir, updated_fname = self._SetupImageInTmpdir()
with self.assertRaises(ValueError) as e:
self._DoBinman('ls', '-i', updated_fname)
finally:
shutil.rmtree(tmpdir)
self.assertIn("Cannot find FDT map in image", str(e.exception))
def _RunListCmd(self, paths, expected):
@ -2480,7 +2518,7 @@ class TestFunctional(unittest.TestCase):
def testExtractCbfsRaw(self):
"""Test extracting CBFS compressed data without decompressing it"""
data = self._RunExtractCmd('section/cbfs/u-boot-dtb', decomp=False)
dtb = tools.Decompress(data, 'lzma', with_header=False)
self.assertEqual(EXTRACT_DTB_SIZE, len(dtb))
def testExtractBadEntry(self):
@ -2506,10 +2544,14 @@ class TestFunctional(unittest.TestCase):
"""Test extracting a file fron an image on the command line"""
self._CheckLz4()
self._DoReadFileRealDtb('130_list_fdtmap.dts')
fname = os.path.join(self._indir, 'output.extact')
try:
tmpdir, updated_fname = self._SetupImageInTmpdir()
with test_util.capture_sys_output() as (stdout, stderr):
self._DoBinman('extract', '-i', updated_fname, 'u-boot',
'-f', fname)
finally:
shutil.rmtree(tmpdir)
data = tools.ReadFile(fname)
self.assertEqual(U_BOOT_DATA, data)
@ -2641,7 +2683,7 @@ class TestFunctional(unittest.TestCase):
image_fname = tools.GetOutputFilename('image.bin')
with self.assertRaises(ValueError) as e:
control.ExtractEntries(image_fname, 'fname', None, [])
self.assertIn('Must specify an entry path to write with -f',
str(e.exception))
def testExtractTooManyEntryPaths(self):
@ -2651,7 +2693,7 @@ class TestFunctional(unittest.TestCase):
image_fname = tools.GetOutputFilename('image.bin')
with self.assertRaises(ValueError) as e:
control.ExtractEntries(image_fname, 'fname', None, ['a', 'b'])
self.assertIn('Must specify exactly one entry path to write with -f',
str(e.exception))
def testPackAlignSection(self):
@ -2719,6 +2761,481 @@ class TestFunctional(unittest.TestCase):
self.assertEqual(len(U_BOOT_DATA), entry.contents_size)
self.assertEqual(len(U_BOOT_DATA), entry.size)
def _RunReplaceCmd(self, entry_name, data, decomp=True, allow_resize=True,
dts='132_replace.dts'):
"""Replace an entry in an image
This writes the entry data to update it, then opens the updated file and
returns the value that it now finds there.
Args:
entry_name: Entry name to replace
data: Data to replace it with
decomp: True to compress the data if needed, False if data is
already compressed so should be used as is
allow_resize: True to allow entries to change size, False to raise
an exception
Returns:
Tuple:
data from entry
data from fdtmap (excluding header)
Image object that was modified
"""
dtb_data = self._DoReadFileDtb(dts, use_real_dtb=True,
update_dtb=True)[1]
self.assertIn('image', control.images)
image = control.images['image']
entries = image.GetEntries()
orig_dtb_data = entries['u-boot-dtb'].data
orig_fdtmap_data = entries['fdtmap'].data
image_fname = tools.GetOutputFilename('image.bin')
updated_fname = tools.GetOutputFilename('image-updated.bin')
tools.WriteFile(updated_fname, tools.ReadFile(image_fname))
image = control.WriteEntry(updated_fname, entry_name, data, decomp,
allow_resize)
data = control.ReadEntry(updated_fname, entry_name, decomp)
# The DT data should not change unless resized:
if not allow_resize:
new_dtb_data = entries['u-boot-dtb'].data
self.assertEqual(new_dtb_data, orig_dtb_data)
new_fdtmap_data = entries['fdtmap'].data
self.assertEqual(new_fdtmap_data, orig_fdtmap_data)
return data, orig_fdtmap_data[fdtmap.FDTMAP_HDR_LEN:], image
def testReplaceSimple(self):
"""Test replacing a single file"""
expected = b'x' * len(U_BOOT_DATA)
data, expected_fdtmap, _ = self._RunReplaceCmd('u-boot', expected,
allow_resize=False)
self.assertEqual(expected, data)
# Test that the state looks right. There should be an FDT for the fdtmap
# that we just read back in, and it should match what we find in the
# 'control' tables. Checking for an FDT that does not exist should
# return None.
path, fdtmap = state.GetFdtContents('fdtmap')
self.assertIsNotNone(path)
self.assertEqual(expected_fdtmap, fdtmap)
dtb = state.GetFdtForEtype('fdtmap')
self.assertEqual(dtb.GetContents(), fdtmap)
missing_path, missing_fdtmap = state.GetFdtContents('missing')
self.assertIsNone(missing_path)
self.assertIsNone(missing_fdtmap)
missing_dtb = state.GetFdtForEtype('missing')
self.assertIsNone(missing_dtb)
self.assertEqual('/binman', state.fdt_path_prefix)
def testReplaceResizeFail(self):
"""Test replacing a file by something larger"""
expected = U_BOOT_DATA + b'x'
with self.assertRaises(ValueError) as e:
self._RunReplaceCmd('u-boot', expected, allow_resize=False,
dts='139_replace_repack.dts')
self.assertIn("Node '/u-boot': Entry data size does not match, but resize is disabled",
str(e.exception))
def testReplaceMulti(self):
"""Test replacing entry data where multiple images are generated"""
data = self._DoReadFileDtb('133_replace_multi.dts', use_real_dtb=True,
update_dtb=True)[0]
expected = b'x' * len(U_BOOT_DATA)
updated_fname = tools.GetOutputFilename('image-updated.bin')
tools.WriteFile(updated_fname, data)
entry_name = 'u-boot'
control.WriteEntry(updated_fname, entry_name, expected,
allow_resize=False)
data = control.ReadEntry(updated_fname, entry_name)
self.assertEqual(expected, data)
# Check the state looks right.
self.assertEqual('/binman/image', state.fdt_path_prefix)
# Now check we can write the first image
image_fname = tools.GetOutputFilename('first-image.bin')
updated_fname = tools.GetOutputFilename('first-updated.bin')
tools.WriteFile(updated_fname, tools.ReadFile(image_fname))
entry_name = 'u-boot'
control.WriteEntry(updated_fname, entry_name, expected,
allow_resize=False)
data = control.ReadEntry(updated_fname, entry_name)
self.assertEqual(expected, data)
# Check the state looks right.
self.assertEqual('/binman/first-image', state.fdt_path_prefix)
def testUpdateFdtAllRepack(self):
"""Test that all device trees are updated with offset/size info"""
data = self._DoReadFileRealDtb('134_fdt_update_all_repack.dts')
SECTION_SIZE = 0x300
DTB_SIZE = 602
FDTMAP_SIZE = 608
base_expected = {
'offset': 0,
'size': SECTION_SIZE + DTB_SIZE * 2 + FDTMAP_SIZE,
'image-pos': 0,
'section:offset': 0,
'section:size': SECTION_SIZE,
'section:image-pos': 0,
'section/u-boot-dtb:offset': 4,
'section/u-boot-dtb:size': 636,
'section/u-boot-dtb:image-pos': 4,
'u-boot-spl-dtb:offset': SECTION_SIZE,
'u-boot-spl-dtb:size': DTB_SIZE,
'u-boot-spl-dtb:image-pos': SECTION_SIZE,
'u-boot-tpl-dtb:offset': SECTION_SIZE + DTB_SIZE,
'u-boot-tpl-dtb:image-pos': SECTION_SIZE + DTB_SIZE,
'u-boot-tpl-dtb:size': DTB_SIZE,
'fdtmap:offset': SECTION_SIZE + DTB_SIZE * 2,
'fdtmap:size': FDTMAP_SIZE,
'fdtmap:image-pos': SECTION_SIZE + DTB_SIZE * 2,
}
main_expected = {
'section:orig-size': SECTION_SIZE,
'section/u-boot-dtb:orig-offset': 4,
}
# We expect three device-tree files in the output, with the first one
# within a fixed-size section.
# Read them in sequence. We look for an 'spl' property in the SPL tree,
# and 'tpl' in the TPL tree, to make sure they are distinct from the
# main U-Boot tree. All three should have the same positions and offset
# except that the main tree should include the main_expected properties
start = 4
for item in ['', 'spl', 'tpl', None]:
if item is None:
start += 16 # Move past fdtmap header
dtb = fdt.Fdt.FromData(data[start:])
dtb.Scan()
props = self._GetPropTree(dtb,
BASE_DTB_PROPS + REPACK_DTB_PROPS + ['spl', 'tpl'],
prefix='/' if item is None else '/binman/')
expected = dict(base_expected)
if item:
expected[item] = 0
else:
# Main DTB and fdtmap should include the 'orig-' properties
expected.update(main_expected)
# Helpful for debugging:
#for prop in sorted(props):
#print('prop %s %s %s' % (prop, props[prop], expected[prop]))
self.assertEqual(expected, props)
if item == '':
start = SECTION_SIZE
else:
start += dtb._fdt_obj.totalsize()
def testFdtmapHeaderMiddle(self):
"""Test an FDT map in the middle of an image when it should be at end"""
with self.assertRaises(ValueError) as e:
self._DoReadFileRealDtb('135_fdtmap_hdr_middle.dts')
self.assertIn("Invalid sibling order 'middle' for image-header: Must be at 'end' to match location",
str(e.exception))
def testFdtmapHeaderStartBad(self):
"""Test an FDT map in middle of an image when it should be at start"""
with self.assertRaises(ValueError) as e:
self._DoReadFileRealDtb('136_fdtmap_hdr_startbad.dts')
self.assertIn("Invalid sibling order 'end' for image-header: Must be at 'start' to match location",
str(e.exception))
def testFdtmapHeaderEndBad(self):
"""Test an FDT map at the start of an image when it should be at end"""
with self.assertRaises(ValueError) as e:
self._DoReadFileRealDtb('137_fdtmap_hdr_endbad.dts')
self.assertIn("Invalid sibling order 'start' for image-header: Must be at 'end' to match location",
str(e.exception))
def testFdtmapHeaderNoSize(self):
"""Test an image header at the end of an image with undefined size"""
self._DoReadFileRealDtb('138_fdtmap_hdr_nosize.dts')
def testReplaceResize(self):
"""Test replacing a single file in an entry with a larger file"""
expected = U_BOOT_DATA + b'x'
data, _, image = self._RunReplaceCmd('u-boot', expected,
dts='139_replace_repack.dts')
self.assertEqual(expected, data)
entries = image.GetEntries()
dtb_data = entries['u-boot-dtb'].data
dtb = fdt.Fdt.FromData(dtb_data)
dtb.Scan()
# The u-boot section should now be larger in the dtb
node = dtb.GetNode('/binman/u-boot')
self.assertEqual(len(expected), fdt_util.GetInt(node, 'size'))
# Same for the fdtmap
fdata = entries['fdtmap'].data
fdtb = fdt.Fdt.FromData(fdata[fdtmap.FDTMAP_HDR_LEN:])
fdtb.Scan()
fnode = fdtb.GetNode('/u-boot')
self.assertEqual(len(expected), fdt_util.GetInt(fnode, 'size'))
def testReplaceResizeNoRepack(self):
"""Test replacing an entry with a larger file when not allowed"""
expected = U_BOOT_DATA + b'x'
with self.assertRaises(ValueError) as e:
self._RunReplaceCmd('u-boot', expected)
self.assertIn('Entry data size does not match, but allow-repack is not present for this image',
str(e.exception))
def testEntryShrink(self):
"""Test contracting an entry after it is packed"""
try:
state.SetAllowEntryContraction(True)
data = self._DoReadFileDtb('140_entry_shrink.dts',
update_dtb=True)[0]
finally:
state.SetAllowEntryContraction(False)
self.assertEqual(b'a', data[:1])
self.assertEqual(U_BOOT_DATA, data[1:1 + len(U_BOOT_DATA)])
self.assertEqual(b'a', data[-1:])
def testEntryShrinkFail(self):
"""Test not being allowed to contract an entry after it is packed"""
data = self._DoReadFileDtb('140_entry_shrink.dts', update_dtb=True)[0]
# In this case there is a spare byte at the end of the data. The size of
# the contents is only 1 byte but we still have the size before it
# shrunk.
self.assertEqual(b'a\0', data[:2])
self.assertEqual(U_BOOT_DATA, data[2:2 + len(U_BOOT_DATA)])
self.assertEqual(b'a\0', data[-2:])
def testDescriptorOffset(self):
"""Test that the Intel descriptor is always placed at at the start"""
data = self._DoReadFileDtb('141_descriptor_offset.dts')
image = control.images['image']
entries = image.GetEntries()
desc = entries['intel-descriptor']
self.assertEqual(0xff800000, desc.offset)
self.assertEqual(0xff800000, desc.image_pos)
def testReplaceCbfs(self):
"""Test replacing a single file in CBFS without changing the size"""
self._CheckLz4()
expected = b'x' * len(U_BOOT_DATA)
data = self._DoReadFileRealDtb('142_replace_cbfs.dts')
updated_fname = tools.GetOutputFilename('image-updated.bin')
tools.WriteFile(updated_fname, data)
entry_name = 'section/cbfs/u-boot'
control.WriteEntry(updated_fname, entry_name, expected,
allow_resize=True)
data = control.ReadEntry(updated_fname, entry_name)
self.assertEqual(expected, data)
def testReplaceResizeCbfs(self):
"""Test replacing a single file in CBFS with one of a different size"""
self._CheckLz4()
expected = U_BOOT_DATA + b'x'
data = self._DoReadFileRealDtb('142_replace_cbfs.dts')
updated_fname = tools.GetOutputFilename('image-updated.bin')
tools.WriteFile(updated_fname, data)
entry_name = 'section/cbfs/u-boot'
control.WriteEntry(updated_fname, entry_name, expected,
allow_resize=True)
data = control.ReadEntry(updated_fname, entry_name)
self.assertEqual(expected, data)
def _SetupForReplace(self):
"""Set up some files to use to replace entries
This generates an image, copies it to a new file, extracts all the files
in it and updates some of them
Returns:
Tuple:
Image filename
Output directory
Expected values for updated entries, each a string
"""
data = self._DoReadFileRealDtb('143_replace_all.dts')
updated_fname = tools.GetOutputFilename('image-updated.bin')
tools.WriteFile(updated_fname, data)
outdir = os.path.join(self._indir, 'extract')
einfos = control.ExtractEntries(updated_fname, None, outdir, [])
expected1 = b'x' + U_BOOT_DATA + b'y'
u_boot_fname1 = os.path.join(outdir, 'u-boot')
tools.WriteFile(u_boot_fname1, expected1)
expected2 = b'a' + U_BOOT_DATA + b'b'
u_boot_fname2 = os.path.join(outdir, 'u-boot2')
tools.WriteFile(u_boot_fname2, expected2)
expected_text = b'not the same text'
text_fname = os.path.join(outdir, 'text')
tools.WriteFile(text_fname, expected_text)
dtb_fname = os.path.join(outdir, 'u-boot-dtb')
dtb = fdt.FdtScan(dtb_fname)
node = dtb.GetNode('/binman/text')
node.AddString('my-property', 'the value')
dtb.Sync(auto_resize=True)
dtb.Flush()
return updated_fname, outdir, expected1, expected2, expected_text
def _CheckReplaceMultiple(self, entry_paths):
"""Handle replacing the contents of multiple entries
Args:
entry_paths: List of entry paths to replace
Returns:
Tuple:
Dict of entries in the image:
key: Entry name
Value: Entry object
Expected values for updated entries, each a string
"""
updated_fname, outdir, expected1, expected2, expected_text = (
self._SetupForReplace())
control.ReplaceEntries(updated_fname, None, outdir, entry_paths)
image = Image.FromFile(updated_fname)
image.LoadData()
return image.GetEntries(), expected1, expected2, expected_text
def testReplaceAll(self):
"""Test replacing the contents of all entries"""
entries, expected1, expected2, expected_text = (
self._CheckReplaceMultiple([]))
data = entries['u-boot'].data
self.assertEqual(expected1, data)
data = entries['u-boot2'].data
self.assertEqual(expected2, data)
data = entries['text'].data
self.assertEqual(expected_text, data)
# Check that the device tree is updated
data = entries['u-boot-dtb'].data
dtb = fdt.Fdt.FromData(data)
dtb.Scan()
node = dtb.GetNode('/binman/text')
self.assertEqual('the value', node.props['my-property'].value)
def testReplaceSome(self):
"""Test replacing the contents of a few entries"""
entries, expected1, expected2, expected_text = (
self._CheckReplaceMultiple(['u-boot2', 'text']))
# This one should not change
data = entries['u-boot'].data
self.assertEqual(U_BOOT_DATA, data)
data = entries['u-boot2'].data
self.assertEqual(expected2, data)
data = entries['text'].data
self.assertEqual(expected_text, data)
def testReplaceCmd(self):
"""Test replacing a file fron an image on the command line"""
self._DoReadFileRealDtb('143_replace_all.dts')
try:
tmpdir, updated_fname = self._SetupImageInTmpdir()
fname = os.path.join(tmpdir, 'update-u-boot.bin')
expected = b'x' * len(U_BOOT_DATA)
tools.WriteFile(fname, expected)
self._DoBinman('replace', '-i', updated_fname, 'u-boot', '-f', fname)
data = tools.ReadFile(updated_fname)
self.assertEqual(expected, data[:len(expected)])
map_fname = os.path.join(tmpdir, 'image-updated.map')
self.assertFalse(os.path.exists(map_fname))
finally:
shutil.rmtree(tmpdir)
def testReplaceCmdSome(self):
"""Test replacing some files fron an image on the command line"""
updated_fname, outdir, expected1, expected2, expected_text = (
self._SetupForReplace())
self._DoBinman('replace', '-i', updated_fname, '-I', outdir,
'u-boot2', 'text')
tools.PrepareOutputDir(None)
image = Image.FromFile(updated_fname)
image.LoadData()
entries = image.GetEntries()
# This one should not change
data = entries['u-boot'].data
self.assertEqual(U_BOOT_DATA, data)
data = entries['u-boot2'].data
self.assertEqual(expected2, data)
data = entries['text'].data
self.assertEqual(expected_text, data)
def testReplaceMissing(self):
"""Test replacing entries where the file is missing"""
updated_fname, outdir, expected1, expected2, expected_text = (
self._SetupForReplace())
# Remove one of the files, to generate a warning
u_boot_fname1 = os.path.join(outdir, 'u-boot')
os.remove(u_boot_fname1)
with test_util.capture_sys_output() as (stdout, stderr):
control.ReplaceEntries(updated_fname, None, outdir, [])
self.assertIn("Skipping entry '/u-boot' from missing file",
stdout.getvalue())
def testReplaceCmdMap(self):
"""Test replacing a file fron an image on the command line"""
self._DoReadFileRealDtb('143_replace_all.dts')
try:
tmpdir, updated_fname = self._SetupImageInTmpdir()
fname = os.path.join(self._indir, 'update-u-boot.bin')
expected = b'x' * len(U_BOOT_DATA)
tools.WriteFile(fname, expected)
self._DoBinman('replace', '-i', updated_fname, 'u-boot',
'-f', fname, '-m')
map_fname = os.path.join(tmpdir, 'image-updated.map')
self.assertTrue(os.path.exists(map_fname))
finally:
shutil.rmtree(tmpdir)
def testReplaceNoEntryPaths(self):
"""Test replacing an entry without an entry path"""
self._DoReadFileRealDtb('143_replace_all.dts')
image_fname = tools.GetOutputFilename('image.bin')
with self.assertRaises(ValueError) as e:
control.ReplaceEntries(image_fname, 'fname', None, [])
self.assertIn('Must specify an entry path to read with -f',
str(e.exception))
def testReplaceTooManyEntryPaths(self):
"""Test extracting some entries"""
self._DoReadFileRealDtb('143_replace_all.dts')
image_fname = tools.GetOutputFilename('image.bin')
with self.assertRaises(ValueError) as e:
control.ReplaceEntries(image_fname, 'fname', None, ['a', 'b'])
self.assertIn('Must specify exactly one entry path to write with -f',
str(e.exception))
if __name__ == "__main__":
unittest.main()
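The command-line tests above share a pattern: copy the image into a temporary directory, operate on the copy, and remove the directory in a finally block so a failing test does not leak files. A minimal self-contained sketch of that pattern (the helper name and file contents here are stand-ins, not binman code):

```python
import os
import shutil
import tempfile

def run_on_image_copy(image_data, func):
    """Write image_data into a fresh temp dir, run func on the copy, clean up."""
    tmpdir = tempfile.mkdtemp(prefix='binman-test.')
    try:
        fname = os.path.join(tmpdir, 'image.bin')
        with open(fname, 'wb') as fd:
            fd.write(image_data)
        return func(fname)
    finally:
        # Always remove the directory, even if func() raises
        shutil.rmtree(tmpdir)

result = run_on_image_copy(b'fake-image',
                           lambda fname: open(fname, 'rb').read())
```

The finally clause guarantees cleanup even when the operation raises, which is why the tests wrap `_SetupImageInTmpdir()` the same way.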


@ -10,6 +10,7 @@ from __future__ import print_function
from collections import OrderedDict
import fnmatch
from operator import attrgetter
import os
import re
import sys
@ -32,22 +33,37 @@ class Image(section.Entry_section):
Attributes:
filename: Output filename for image
image_node: Name of node containing the description for this image
fdtmap_dtb: Fdt object for the fdtmap when loading from a file
fdtmap_data: Contents of the fdtmap when loading from a file
allow_repack: True to add properties to allow the image to be safely
repacked later
Args:
copy_to_orig: Copy offset/size to orig_offset/orig_size after reading
from the device tree
test: True if this is being called from a test of Images. In this case
there is no device tree defining the structure of the section, so
we create a section manually.
"""
def __init__(self, name, node, test=False):
self.image = self
section.Entry_section.__init__(self, None, 'section', node, test)
def __init__(self, name, node, copy_to_orig=True, test=False):
section.Entry_section.__init__(self, None, 'section', node, test=test)
self.copy_to_orig = copy_to_orig
self.name = 'main-section'
self.image_name = name
self._filename = '%s.bin' % self.image_name
self.fdtmap_dtb = None
self.fdtmap_data = None
self.allow_repack = False
if not test:
filename = fdt_util.GetString(self._node, 'filename')
if filename:
self._filename = filename
self.ReadNode()
def ReadNode(self):
section.Entry_section.ReadNode(self)
filename = fdt_util.GetString(self._node, 'filename')
if filename:
self._filename = filename
self.allow_repack = fdt_util.GetBool(self._node, 'allow-repack')
@classmethod
def FromFile(cls, fname):
@ -78,12 +94,22 @@ class Image(section.Entry_section):
data[pos + fdtmap.FDTMAP_HDR_LEN:pos + 256])
dtb_size = probe_dtb.GetFdtObj().totalsize()
fdtmap_data = data[pos:pos + dtb_size + fdtmap.FDTMAP_HDR_LEN]
dtb = fdt.Fdt.FromData(fdtmap_data[fdtmap.FDTMAP_HDR_LEN:])
fdt_data = fdtmap_data[fdtmap.FDTMAP_HDR_LEN:]
out_fname = tools.GetOutputFilename('fdtmap.in.dtb')
tools.WriteFile(out_fname, fdt_data)
dtb = fdt.Fdt(out_fname)
dtb.Scan()
# Return an Image with the associated nodes
image = Image('image', dtb.GetRoot())
root = dtb.GetRoot()
image = Image('image', root, copy_to_orig=False)
image.image_node = fdt_util.GetString(root, 'image-node', 'image')
image.fdtmap_dtb = dtb
image.fdtmap_data = fdtmap_data
image._data = data
image._filename = fname
image.image_name, _ = os.path.splitext(fname)
return image
def Raise(self, msg):
@ -117,16 +143,14 @@ class Image(section.Entry_section):
"""Write symbol values into binary files for access at run time"""
section.Entry_section.WriteSymbols(self, self)
def BuildSection(self, fd, base_offset):
"""Write the section to a file"""
fd.seek(base_offset)
fd.write(self.GetData())
def BuildImage(self):
"""Write the image to a file"""
fname = tools.GetOutputFilename(self._filename)
tout.Info("Writing image to '%s'" % fname)
with open(fname, 'wb') as fd:
self.BuildSection(fd, 0)
data = self.GetData()
fd.write(data)
tout.Info("Wrote %#x bytes" % len(data))
def WriteMap(self):
"""Write a map of the image to a .map file


@ -8,12 +8,24 @@
import hashlib
import re
import fdt
import os
import tools
import tout
# Records the device-tree files known to binman, keyed by filename (e.g.
# 'u-boot-spl.dtb')
fdt_files = {}
# Records the device-tree files known to binman, keyed by entry type (e.g.
# 'u-boot-spl-dtb'). These are the output FDT files, which can be updated by
# binman. They have been copied to <xxx>.out files.
#
# key: entry type
# value: tuple:
# Fdt object
# Filename
# Entry object, or None if not known
output_fdt_info = {}
# Prefix to add to an fdtmap path to turn it into a path to the /binman node
fdt_path_prefix = ''
# Arguments passed to binman to provide arguments to entries
entry_args = {}
@ -22,12 +34,6 @@ entry_args = {}
# ftest.py)
use_fake_dtb = False
# Set of all device tree files references by images
fdt_set = set()
# Same as above, but excluding the main one
fdt_subset = set()
# The DTB which contains the full image information
main_dtb = None
@ -36,35 +42,47 @@ main_dtb = None
# Entry.ProcessContentsUpdate()
allow_entry_expansion = True
def GetFdt(fname):
"""Get the Fdt object for a particular device-tree filename
# Don't allow entries to contract after they have been packed. Instead just
# leave some wasted space. If allowed, this is detected and forces a re-pack,
# but may result in entries that oscillate in size, thus causing a pack error.
# An example is a compressed device tree where the original offset values
# result in a larger compressed size than the new ones, but then after updating
# to the new ones, the compressed size increases, etc.
allow_entry_contraction = False
def GetFdtForEtype(etype):
"""Get the Fdt object for a particular device-tree entry
Binman keeps track of at least one device-tree file called u-boot.dtb but
can also have others (e.g. for SPL). This function looks up the given
filename and returns the associated Fdt object.
entry and returns the associated Fdt object.
Args:
fname: Filename to look up (e.g. 'u-boot.dtb').
etype: Entry type of device tree (e.g. 'u-boot-dtb')
Returns:
Fdt object associated with the filename
Fdt object associated with the entry type
"""
return fdt_files[fname]
value = output_fdt_info.get(etype)
if not value:
return None
return value[0]
def GetFdtPath(fname):
def GetFdtPath(etype):
"""Get the full pathname of a particular Fdt object
Similar to GetFdt() but returns the pathname associated with the Fdt.
Similar to GetFdtForEtype() but returns the pathname associated with the
Fdt.
Args:
fname: Filename to look up (e.g. 'u-boot.dtb').
etype: Entry type of device tree (e.g. 'u-boot-dtb')
Returns:
Full path name to the associated Fdt
"""
return fdt_files[fname]._fname
return output_fdt_info[etype][0]._fname
def GetFdtContents(fname='u-boot.dtb'):
def GetFdtContents(etype='u-boot-dtb'):
"""Looks up the FDT pathname and contents
This is used to obtain the Fdt pathname and contents when needed by an
@ -72,21 +90,40 @@ def GetFdtContents(fname='u-boot.dtb'):
the real dtb.
Args:
fname: Filename to look up (e.g. 'u-boot.dtb').
etype: Entry type to look up (e.g. 'u-boot-dtb').
Returns:
tuple:
pathname to Fdt
Fdt data (as bytes)
"""
if fname in fdt_files and not use_fake_dtb:
pathname = GetFdtPath(fname)
data = GetFdt(fname).GetContents()
if etype not in output_fdt_info:
return None, None
if not use_fake_dtb:
pathname = GetFdtPath(etype)
data = GetFdtForEtype(etype).GetContents()
else:
fname = output_fdt_info[etype][1]
pathname = tools.GetInputFilename(fname)
data = tools.ReadFile(pathname)
return pathname, data
def UpdateFdtContents(etype, data):
"""Update the contents of a particular device tree
The device tree is updated and written back to its file. This affects what
is returned from future calls to GetFdtContents(), etc.
Args:
etype: Entry type (e.g. 'u-boot-dtb')
data: Data to replace the DTB with
"""
dtb, fname, entry = output_fdt_info[etype]
dtb_fname = dtb.GetFilename()
tools.WriteFile(dtb_fname, data)
dtb = fdt.FdtScan(dtb_fname)
output_fdt_info[etype] = [dtb, fname, entry]
def SetEntryArgs(args):
"""Set the value of the entry args
@ -120,14 +157,14 @@ def GetEntryArg(name):
def Prepare(images, dtb):
"""Get device tree files ready for use
This sets up a set of device tree files that can be retrieved by GetFdts().
At present there is only one, that for U-Boot proper.
This sets up a set of device tree files that can be retrieved by
GetAllFdts(). This includes U-Boot proper and any SPL device trees.
Args:
images: List of images being used
dtb: Main dtb
"""
global fdt_set, fdt_subset, fdt_files, main_dtb
global output_fdt_info, main_dtb, fdt_path_prefix
# Import these here in case libfdt.py is not available, in which case
# the above help option still works.
import fdt
@ -138,33 +175,82 @@ def Prepare(images, dtb):
# since it is assumed to be the one passed in with options.dt, and
# was handled just above.
main_dtb = dtb
fdt_files.clear()
fdt_files['u-boot.dtb'] = dtb
fdt_subset = set()
output_fdt_info.clear()
fdt_path_prefix = ''
output_fdt_info['u-boot-dtb'] = [dtb, 'u-boot.dtb', None]
output_fdt_info['u-boot-spl-dtb'] = [dtb, 'spl/u-boot-spl.dtb', None]
output_fdt_info['u-boot-tpl-dtb'] = [dtb, 'tpl/u-boot-tpl.dtb', None]
if not use_fake_dtb:
fdt_set = {}
for image in images.values():
fdt_subset.update(image.GetFdtSet())
fdt_subset.discard('u-boot.dtb')
for other_fname in fdt_subset:
fdt_set.update(image.GetFdts())
for etype, other in fdt_set.items():
entry, other_fname = other
infile = tools.GetInputFilename(other_fname)
other_fname_dtb = fdt_util.EnsureCompiled(infile)
out_fname = tools.GetOutputFilename('%s.out' %
os.path.split(other_fname)[1])
tools.WriteFile(out_fname, tools.ReadFile(other_fname_dtb))
other_dtb = fdt.FdtScan(out_fname)
fdt_files[other_fname] = other_dtb
output_fdt_info[etype] = [other_dtb, out_fname, entry]
def GetFdts():
def PrepareFromLoadedData(image):
"""Get device tree files ready for use with a loaded image
Loaded images are different from images that are being created by binman,
since there is generally already an fdtmap and we read the description from
that. This provides the position and size of every entry in the image with
no calculation required.
This function uses the same output_fdt_info[] as Prepare(). It finds the
device tree files, adds a reference to the fdtmap and sets the FDT path
prefix to translate from the fdtmap (where the root node is the image node)
to the normal device tree (where the image node is under a /binman node).
Args:
image: Image object that was loaded from a file
"""
global output_fdt_info, main_dtb, fdt_path_prefix
tout.Info('Preparing device trees')
output_fdt_info.clear()
fdt_path_prefix = ''
output_fdt_info['fdtmap'] = [image.fdtmap_dtb, 'u-boot.dtb', None]
main_dtb = None
tout.Info(" Found device tree type 'fdtmap' '%s'" % image.fdtmap_dtb.name)
for etype, value in image.GetFdts().items():
entry, fname = value
out_fname = tools.GetOutputFilename('%s.dtb' % entry.etype)
tout.Info(" Found device tree type '%s' at '%s' path '%s'" %
(etype, out_fname, entry.GetPath()))
entry._filename = entry.GetDefaultFilename()
data = entry.ReadData()
tools.WriteFile(out_fname, data)
dtb = fdt.Fdt(out_fname)
dtb.Scan()
image_node = dtb.GetNode('/binman')
if 'multiple-images' in image_node.props:
image_node = dtb.GetNode('/binman/%s' % image.image_node)
fdt_path_prefix = image_node.path
output_fdt_info[etype] = [dtb, None, entry]
tout.Info(" FDT path prefix '%s'" % fdt_path_prefix)
def GetAllFdts():
"""Yield all device tree files being used by binman
Yields:
Device trees being used (U-Boot proper, SPL, TPL)
"""
yield main_dtb
for other_fname in fdt_subset:
yield fdt_files[other_fname]
if main_dtb:
yield main_dtb
for etype in output_fdt_info:
dtb = output_fdt_info[etype][0]
if dtb != main_dtb:
yield dtb
def GetUpdateNodes(node):
def GetUpdateNodes(node, for_repack=False):
"""Yield all the nodes that need to be updated in all device trees
The property referenced by this node is added to any device trees which
@ -173,25 +259,32 @@ def GetUpdateNodes(node):
Args:
node: Node object in the main device tree to look up
for_repack: True if we want only nodes which need 'repack' properties
added to them (e.g. 'orig-offset'), False to return all nodes. We
don't add repack properties to SPL/TPL device trees.
Yields:
Node objects in each device tree that is in use (U-Boot proper, which
is node, SPL and TPL)
"""
yield node
for dtb in fdt_files.values():
for dtb, fname, entry in output_fdt_info.values():
if dtb != node.GetFdt():
other_node = dtb.GetNode(node.path)
if for_repack and entry.etype != 'u-boot-dtb':
continue
other_node = dtb.GetNode(fdt_path_prefix + node.path)
#print(' try', fdt_path_prefix + node.path, other_node)
if other_node:
yield other_node
def AddZeroProp(node, prop):
def AddZeroProp(node, prop, for_repack=False):
"""Add a new property to affected device trees with an integer value of 0.
Args:
prop_name: Name of property
for_repack: True if this property is only needed for repacking
"""
for n in GetUpdateNodes(node):
for n in GetUpdateNodes(node, for_repack):
n.AddZeroProp(prop)
def AddSubnode(node, name):
@ -221,15 +314,18 @@ def AddString(node, prop, value):
for n in GetUpdateNodes(node):
n.AddString(prop, value)
def SetInt(node, prop, value):
def SetInt(node, prop, value, for_repack=False):
"""Update an integer property in affected device trees with an integer value
This is not allowed to change the size of the FDT.
Args:
prop_name: Name of property
for_repack: True if this property is only needed for repacking
"""
for n in GetUpdateNodes(node):
for n in GetUpdateNodes(node, for_repack):
tout.Detail("File %s: Update node '%s' prop '%s' to %#x" %
(n.GetFdt().name, n.path, prop, value))
n.SetInt(prop, value)
def CheckAddHashProp(node):
@ -274,3 +370,22 @@ def AllowEntryExpansion():
raised
"""
return allow_entry_expansion
def SetAllowEntryContraction(allow):
"""Set whether post-pack contraction of entries is allowed
Args:
allow: True to allow contraction, False to raise an exception
"""
global allow_entry_contraction
allow_entry_contraction = allow
def AllowEntryContraction():
"""Check whether post-pack contraction of entries is allowed
Returns:
True if contraction should be allowed, False if an exception should be
raised
"""
return allow_entry_contraction
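Callers of SetAllowEntryContraction() use the try/finally idiom seen in testEntryShrink() so the global flag is always restored. A self-contained sketch of the same set/restore-flag pattern (simplified from the functions above):

```python
# Module-level flag, disabled by default
allow_entry_contraction = False

def SetAllowEntryContraction(allow):
    """Set whether post-pack contraction of entries is allowed"""
    global allow_entry_contraction
    allow_entry_contraction = allow

def AllowEntryContraction():
    """Check whether post-pack contraction of entries is allowed"""
    return allow_entry_contraction

# Enable contraction only for the duration of one operation
try:
    SetAllowEntryContraction(True)
    during = AllowEntryContraction()
finally:
    # Restore the default even if the operation raises
    SetAllowEntryContraction(False)
```

After the finally block runs, the flag is back to its default, so one test cannot affect the behaviour of the next.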


@ -0,0 +1,21 @@
// SPDX-License-Identifier: GPL-2.0+
/dts-v1/;
/ {
#address-cells = <1>;
#size-cells = <1>;
binman {
size = <0xc00>;
u-boot {
};
fdtmap {
};
u-boot-dtb {
};
image-header {
location = "end";
};
};
};


@ -0,0 +1,33 @@
// SPDX-License-Identifier: GPL-2.0+
/dts-v1/;
/ {
#address-cells = <1>;
#size-cells = <1>;
binman {
multiple-images;
first-image {
size = <0xc00>;
u-boot {
};
fdtmap {
};
u-boot-dtb {
};
image-header {
location = "end";
};
};
image {
fdtmap {
};
u-boot {
};
u-boot-dtb {
};
};
};
};


@ -0,0 +1,23 @@
// SPDX-License-Identifier: GPL-2.0+
/dts-v1/;
/ {
#address-cells = <1>;
#size-cells = <1>;
binman {
allow-repack;
section {
size = <0x300>;
u-boot-dtb {
offset = <4>;
};
};
u-boot-spl-dtb {
};
u-boot-tpl-dtb {
};
fdtmap {
};
};
};


@ -0,0 +1,16 @@
/dts-v1/;
/ {
#address-cells = <1>;
#size-cells = <1>;
binman {
u-boot {
};
image-header {
location = "end";
};
fdtmap {
};
};
};


@ -0,0 +1,16 @@
/dts-v1/;
/ {
#address-cells = <1>;
#size-cells = <1>;
binman {
u-boot {
};
fdtmap {
};
image-header {
location = "start";
};
};
};


@ -0,0 +1,16 @@
/dts-v1/;
/ {
#address-cells = <1>;
#size-cells = <1>;
binman {
image-header {
location = "end";
};
u-boot {
};
fdtmap {
};
};
};


@ -0,0 +1,16 @@
/dts-v1/;
/ {
#address-cells = <1>;
#size-cells = <1>;
binman {
u-boot {
};
fdtmap {
};
image-header {
location = "end";
};
};
};


@ -0,0 +1,22 @@
// SPDX-License-Identifier: GPL-2.0+
/dts-v1/;
/ {
#address-cells = <1>;
#size-cells = <1>;
binman {
size = <0xc00>;
allow-repack;
u-boot {
};
fdtmap {
};
u-boot-dtb {
};
image-header {
location = "end";
};
};
};


@ -0,0 +1,20 @@
/dts-v1/;
/ {
#address-cells = <1>;
#size-cells = <1>;
binman {
_testing {
bad-shrink-contents;
};
u-boot {
};
_testing2 {
type = "_testing";
bad-shrink-contents;
};
};
};


@ -0,0 +1,20 @@
// SPDX-License-Identifier: GPL-2.0+
/dts-v1/;
/ {
#address-cells = <1>;
#size-cells = <1>;
binman {
sort-by-offset;
end-at-4gb;
size = <0x800000>;
u-boot {
offset = <0xffff0000>;
};
intel-descriptor {
filename = "descriptor.bin";
};
};
};


@ -0,0 +1,37 @@
// SPDX-License-Identifier: GPL-2.0+
/dts-v1/;
/ {
#address-cells = <1>;
#size-cells = <1>;
binman {
size = <0xe00>;
allow-repack;
u-boot {
};
section {
align = <0x100>;
cbfs {
size = <0x400>;
u-boot {
cbfs-type = "raw";
};
u-boot-dtb {
cbfs-type = "raw";
cbfs-compress = "lzma";
cbfs-offset = <0x80>;
};
};
u-boot-dtb {
compress = "lz4";
};
};
fdtmap {
};
image-header {
location = "end";
};
};
};


@@ -0,0 +1,28 @@
// SPDX-License-Identifier: GPL-2.0+
/dts-v1/;
/ {
#address-cells = <1>;
#size-cells = <1>;
binman {
size = <0xc00>;
allow-repack;
u-boot {
};
fdtmap {
};
u-boot2 {
type = "u-boot";
};
text {
text = "some text";
};
u-boot-dtb {
};
image-header {
location = "end";
};
};
};


@@ -71,6 +71,10 @@ def run_tests(args):
print(err)
for _, err in result.failures:
print(err)
if result.errors or result.failures:
print('dtoc tests FAILED')
return 1
return 0
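The change above makes the test runner's exit status reflect failures instead of always returning success. The mapping it uses can be sketched as:

```python
import unittest

def to_exit_code(result):
    # Non-zero exit status when any test errored or failed, else zero
    return 1 if (result.errors or result.failures) else 0
```

so that `sys.exit(to_exit_code(result))` lets CI scripts detect test failures.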
def RunTestCoverage():
"""Run the tests and check that we get 100% coverage"""
@@ -101,7 +105,8 @@ parser.add_option('-T', '--test-coverage', action='store_true',
# Run our meagre tests
if options.test:
run_tests(args)
ret_code = run_tests(args)
sys.exit(ret_code)
elif options.test_coverage:
RunTestCoverage()

View file

@@ -362,6 +362,23 @@ class Node:
value = tools.GetBytes(0, len)
self.props[prop_name] = Prop(self, None, prop_name, value)
def _CheckProp(self, prop_name):
"""Check if a property is present
Args:
prop_name: Name of property
Returns:
self
Raises:
ValueError if the property is missing
"""
if prop_name not in self.props:
raise ValueError("Fdt '%s', node '%s': Missing property '%s'" %
(self._fdt._fname, self.path, prop_name))
return self
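`_CheckProp` returns `self` so the presence check can be chained in front of the property access, as the lines below do. A stripped-down sketch of the pattern (the class here is a minimal stand-in, not the real `fdt.Node`):

```python
class SimpleNode:
    # Minimal stand-in for fdt.Node, just to show the chaining pattern
    def __init__(self, path, props):
        self.path = path
        self.props = props

    def _check_prop(self, prop_name):
        # Raise a clear error naming the node and property, then return
        # self so the call can be chained: node._check_prop(n).props[n]
        if prop_name not in self.props:
            raise ValueError("node '%s': Missing property '%s'" %
                             (self.path, prop_name))
        return self

    def set_int(self, prop_name, val):
        self._check_prop(prop_name).props[prop_name] = val
```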
def SetInt(self, prop_name, val):
"""Update an integer property int the device tree.
@@ -374,7 +391,7 @@ class Node:
prop_name: Name of property
val: Value to set
"""
self.props[prop_name].SetInt(val)
self._CheckProp(prop_name).props[prop_name].SetInt(val)
def SetData(self, prop_name, val):
"""Set the data value of a property
@@ -386,7 +403,7 @@ class Node:
prop_name: Name of property to set
val: Data value to set
"""
self.props[prop_name].SetData(val)
self._CheckProp(prop_name).props[prop_name].SetData(val)
def SetString(self, prop_name, val):
"""Set the string value of a property
@@ -400,7 +417,7 @@ class Node:
"""
if sys.version_info[0] >= 3: # pragma: no cover
val = bytes(val, 'utf-8')
self.props[prop_name].SetData(val + b'\0')
self._CheckProp(prop_name).props[prop_name].SetData(val + b'\0')
def AddString(self, prop_name, val):
"""Add a new string property to a node
@@ -481,29 +498,35 @@ class Fdt:
Properties:
fname: Filename of fdt
_root: Root of device tree (a Node object)
name: Helpful name for this Fdt for the user (useful when creating the
DT from data rather than a file)
"""
def __init__(self, fname):
self._fname = fname
self._cached_offsets = False
self.phandle_to_node = {}
self.name = ''
if self._fname:
self.name = self._fname
self._fname = fdt_util.EnsureCompiled(self._fname)
with open(self._fname, 'rb') as fd:
self._fdt_obj = libfdt.Fdt(fd.read())
@staticmethod
def FromData(data):
def FromData(data, name=''):
"""Create a new Fdt object from the given data
Args:
data: Device-tree data blob
name: Helpful name for this Fdt for the user
Returns:
Fdt object containing the data
"""
fdt = Fdt(None)
fdt._fdt_obj = libfdt.Fdt(bytes(data))
fdt.name = name
return fdt
def LookupPhandle(self, phandle):
@@ -551,6 +574,8 @@ class Fdt:
parts = path.split('/')
if len(parts) < 2:
return None
if len(parts) == 2 and parts[1] == '':
return node
for part in parts[1:]:
node = node.FindNode(part)
if not node:
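The special case added above makes `GetNode('/')` return the root node rather than failing. The traversal logic can be sketched standalone (`Node` here is a toy stand-in for the real class):

```python
class Node:
    # Toy node: a name plus a dict of subnodes keyed by name
    def __init__(self, name, subnodes=()):
        self.name = name
        self.subnodes = {n.name: n for n in subnodes}

    def find(self, name):
        return self.subnodes.get(name)

def get_node(root, path):
    # Walk a '/'-separated path; '/' on its own returns the root node
    parts = path.split('/')
    if len(parts) < 2:
        return None
    if len(parts) == 2 and parts[1] == '':
        return root
    node = root
    for part in parts[1:]:
        node = node.find(part)
        if not node:
            return None
    return node
```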
@@ -670,6 +695,14 @@ class Fdt:
node = Node(fdt, parent, offset, name, path)
return node
def GetFilename(self):
"""Get the filename of the device tree
Returns:
String filename
"""
return self._fname
def FdtScan(fname):
"""Returns a new Fdt object"""
dtb = Fdt(fname)


@@ -43,12 +43,14 @@ def fdt_cells_to_cpu(val, cells):
out = out << 32 | fdt32_to_cpu(val[1])
return out
def EnsureCompiled(fname, capture_stderr=False):
def EnsureCompiled(fname, tmpdir=None, capture_stderr=False):
"""Compile an fdt .dts source file into a .dtb binary blob if needed.
Args:
fname: Filename (if .dts it will be compiled). If not, it will be
left alone
tmpdir: Temporary directory for output files, or None to use the
tools-module output directory
Returns:
Filename of resulting .dtb file
@@ -57,8 +59,12 @@ def EnsureCompiled(fname, capture_stderr=False):
if ext != '.dts':
return fname
dts_input = tools.GetOutputFilename('source.dts')
dtb_output = tools.GetOutputFilename('source.dtb')
if tmpdir:
dts_input = os.path.join(tmpdir, 'source.dts')
dtb_output = os.path.join(tmpdir, 'source.dtb')
else:
dts_input = tools.GetOutputFilename('source.dts')
dtb_output = tools.GetOutputFilename('source.dtb')
search_paths = [os.path.join(os.getcwd(), 'include')]
root, _ = os.path.splitext(fname)


@@ -9,7 +9,9 @@ from __future__ import print_function
from optparse import OptionParser
import glob
import os
import shutil
import sys
import tempfile
import unittest
# Bring in the patman libraries
@@ -77,11 +79,16 @@ class TestFdt(unittest.TestCase):
"""Test the GetNode() method"""
node = self.dtb.GetNode('/spl-test')
self.assertTrue(isinstance(node, fdt.Node))
node = self.dtb.GetNode('/i2c@0/pmic@9')
self.assertTrue(isinstance(node, fdt.Node))
self.assertEqual('pmic@9', node.name)
self.assertIsNone(self.dtb.GetNode('/i2c@0/pmic@9/missing'))
node = self.dtb.GetNode('/')
self.assertTrue(isinstance(node, fdt.Node))
self.assertEqual(0, node.Offset())
def testFlush(self):
"""Check that we can flush the device tree out to its file"""
fname = self.dtb._fname
@@ -421,6 +428,32 @@ class TestProp(unittest.TestCase):
self.dtb.Sync(auto_resize=True)
self.assertTrue(dtb2.GetContents() != self.dtb.GetContents())
def testMissingSetInt(self):
"""Test handling of a missing property with SetInt"""
with self.assertRaises(ValueError) as e:
self.node.SetInt('one', 1)
self.assertIn("node '/spl-test': Missing property 'one'",
str(e.exception))
def testMissingSetData(self):
"""Test handling of a missing property with SetData"""
with self.assertRaises(ValueError) as e:
self.node.SetData('one', b'data')
self.assertIn("node '/spl-test': Missing property 'one'",
str(e.exception))
def testMissingSetString(self):
"""Test handling of a missing property with SetString"""
with self.assertRaises(ValueError) as e:
self.node.SetString('one', '1')
self.assertIn("node '/spl-test': Missing property 'one'",
str(e.exception))
def testGetFilename(self):
"""Test the dtb filename can be provided"""
self.assertEqual(tools.GetOutputFilename('source.dtb'),
self.dtb.GetFilename())
class TestFdtUtil(unittest.TestCase):
"""Tests for the fdt_util module
@@ -514,10 +547,23 @@ class TestFdtUtil(unittest.TestCase):
self.assertEqual(0x12345678, fdt_util.fdt_cells_to_cpu(val, 1))
def testEnsureCompiled(self):
"""Test a degenerate case of this function"""
"""Test a degenerate case of this function (file already compiled)"""
dtb = fdt_util.EnsureCompiled('tools/dtoc/dtoc_test_simple.dts')
self.assertEqual(dtb, fdt_util.EnsureCompiled(dtb))
def testEnsureCompiledTmpdir(self):
"""Test providing a temporary directory"""
try:
old_outdir = tools.outdir
tools.outdir = None
tmpdir = tempfile.mkdtemp(prefix='test_fdt.')
dtb = fdt_util.EnsureCompiled('tools/dtoc/dtoc_test_simple.dts',
tmpdir)
self.assertEqual(tmpdir, os.path.dirname(dtb))
shutil.rmtree(tmpdir)
finally:
tools.outdir = old_outdir
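The test above saves and restores `tools.outdir` in a try/finally so a failure cannot leak state into later tests. The same cleanup discipline for the temporary directory can be wrapped as a small helper:

```python
import shutil
import tempfile

def with_tmpdir(func):
    # Create a fresh temporary directory, run func on it, always clean up
    tmpdir = tempfile.mkdtemp(prefix='test_fdt.')
    try:
        return func(tmpdir)
    finally:
        shutil.rmtree(tmpdir)
```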
def RunTestCoverage():
"""Run the tests and check that we get 100% coverage"""


@@ -9,6 +9,7 @@ import command
import glob
import os
import shutil
import struct
import sys
import tempfile
@@ -82,6 +83,7 @@ def FinaliseOutputDir():
"""Tidy up: delete output directory if temporary and not preserved."""
if outdir and not preserve_outdir:
_RemoveOutputDir()
outdir = None
def GetOutputFilename(fname):
"""Return a filename within the output directory.
@@ -100,6 +102,7 @@ def _FinaliseForTest():
if outdir:
_RemoveOutputDir()
outdir = None
def SetInputDirs(dirname):
"""Add a list of input directories, where input files are kept.
@@ -377,7 +380,7 @@ def ToBytes(string):
return string.encode('utf-8')
return string
def Compress(indata, algo):
def Compress(indata, algo, with_header=True):
"""Compress some data using a given algorithm
Note that for lzma this uses an old version of the algorithm, not that
@@ -408,9 +411,12 @@ def Compress(indata, algo):
data = Run('gzip', '-c', fname, binary=True)
else:
raise ValueError("Unknown algorithm '%s'" % algo)
if with_header:
hdr = struct.pack('<I', len(data))
data = hdr + data
return data
def Decompress(indata, algo):
def Decompress(indata, algo, with_header=True):
"""Decompress some data using a given algorithm
Note that for lzma this uses an old version of the algorithm, not that
@@ -428,6 +434,9 @@ def Decompress(indata, algo):
"""
if algo == 'none':
return indata
if with_header:
data_len = struct.unpack('<I', indata[:4])[0]
indata = indata[4:4 + data_len]
fname = GetOutputFilename('%s.decomp.tmp' % algo)
with open(fname, 'wb') as fd:
fd.write(indata)
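The `with_header` flag added above prepends a 4-byte little-endian length field so the decompressor knows exactly how much payload to read, even if padding follows. The framing on its own:

```python
import struct

def add_header(data):
    # Prepend a 4-byte little-endian length field to the payload
    return struct.pack('<I', len(data)) + data

def strip_header(blob):
    # Read the length field and return exactly that many payload bytes
    (length,) = struct.unpack('<I', blob[:4])
    return blob[4:4 + length]
```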
@@ -473,3 +482,19 @@ def RunIfwiTool(ifwi_file, cmd, fname=None, subpart=None, entry_name=None):
if entry_name:
args += ['-d', '-e', entry_name]
Run(*args)
def ToHex(val):
"""Convert an integer value (or None) to a string
Returns:
hex value, or 'None' if the value is None
"""
return 'None' if val is None else '%#x' % val
def ToHexSize(val):
"""Return the size of an object in hex
Returns:
hex value of size, or 'None' if the value is None
"""
return 'None' if val is None else '%#x' % len(val)
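The two helpers above lean on Python's `%#x` conversion, which adds the `0x` prefix automatically. A quick demonstration of the behaviour:

```python
def to_hex(val):
    # '%#x' renders 16 as '0x10'; None becomes the literal string 'None'
    return 'None' if val is None else '%#x' % val
```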


@@ -4,16 +4,14 @@
# Terminal output logging.
#
from __future__ import print_function
import sys
import terminal
# Output verbosity levels that we support
ERROR = 0
WARNING = 1
NOTICE = 2
INFO = 3
DEBUG = 4
ERROR, WARNING, NOTICE, INFO, DETAIL, DEBUG = range(6)
in_progress = False
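With the levels collapsed into one `range(6)` assignment, gating a message becomes a simple comparison. A sketch of the check, assuming (as the functions below suggest) that a message is shown when its level is within the current verbosity:

```python
ERROR, WARNING, NOTICE, INFO, DETAIL, DEBUG = range(6)

def should_emit(level, verbosity):
    # A message is shown when its level does not exceed the verbosity
    return level <= verbosity
```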
@@ -87,7 +85,7 @@ def _Output(level, msg, color=None):
ClearProgress()
if color:
msg = _color.Color(color, msg)
_stdout.write(msg + '\n')
print(msg)
def DoOutput(level, msg):
"""Output a message to the terminal.
@@ -105,7 +103,7 @@ def Error(msg):
Args:
msg: Message to display.
"""
_Output(0, msg, _color.RED)
_Output(ERROR, msg, _color.RED)
def Warning(msg):
"""Display a warning message
@@ -113,7 +111,7 @@ def Warning(msg):
Args:
msg: Message to display.
"""
_Output(1, msg, _color.YELLOW)
_Output(WARNING, msg, _color.YELLOW)
def Notice(msg):
"""Display an important infomation message
@@ -121,7 +119,7 @@ def Notice(msg):
Args:
msg: Message to display.
"""
_Output(2, msg)
_Output(NOTICE, msg)
def Info(msg):
"""Display an infomation message
@@ -129,7 +127,7 @@ def Info(msg):
Args:
msg: Message to display.
"""
_Output(3, msg)
_Output(INFO, msg)
def Detail(msg):
"""Display a detailed message
@@ -137,7 +135,7 @@ def Detail(msg):
Args:
msg: Message to display.
"""
_Output(4, msg)
_Output(DETAIL, msg)
def Debug(msg):
"""Display a debug message
@@ -145,7 +143,7 @@ def Debug(msg):
Args:
msg: Message to display.
"""
_Output(5, msg)
_Output(DEBUG, msg)
def UserOutput(msg):
"""Display a message regardless of the current output level.