Python's distutils copy_tree code maintains a cache of the directories it has
created, so deleting a tree by some other means and then copying the same
tree again results in an error: the destination directories are not
recreated because distutils thinks they still exist. The
solution is to implement our own copy tree function.
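A minimal sketch of such a copy tree helper with no directory cache (the
name and shape are illustrative, not the RSB implementation):

  import os
  import shutil

  def copy_tree(src, dst):
      # Walk the source tree and copy it entry by entry. Unlike
      # distutils.dir_util.copy_tree, no record of created directories
      # is kept, so the destination can be deleted and recreated
      # between calls without error.
      if not os.path.isdir(dst):
          os.makedirs(dst)
      for name in os.listdir(src):
          s = os.path.join(src, name)
          d = os.path.join(dst, name)
          if os.path.isdir(s):
              copy_tree(s, d)
          else:
              shutil.copy2(s, d)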
This change adds support for building the autotools if the host installed
versions are not suitable. Autoconf and automake have hard coded
references to the install prefix and the host tools, which makes them
impossible to relocate, that is to use under any path other than the
install prefix. To bootstrap automake you first need to build a suitable
autoconf, and with that you can build automake for the install prefix. The
other complication is not referencing the install prefix in the path when
building in the RSB. Having the install prefix in the path can result in
strange issues appearing, such as gcc using a new assembler feature not
present in an older assembler installed under the install prefix.
The process is to build the autotools with an install prefix set to an
internal path inside the RSB temporary path, and then to use that autoconf
to build the version for the install prefix. The internal install
prefix version is also used to bootstrap RTEMS.
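A rough illustration of the two stage process; the paths, helper name and
exact commands are assumptions, not the RSB's actual build steps:

  import os
  import subprocess

  def bootstrap_autotools(rsb_tmp, prefix, autoconf_src, automake_src):
      # Stage 1: install autoconf and automake into an internal prefix
      # inside the RSB temporary path; the real install prefix is never
      # placed in the PATH.
      internal = os.path.join(rsb_tmp, 'internal-autotools')
      env = dict(os.environ)
      env['PATH'] = os.path.join(internal, 'bin') + os.pathsep + env['PATH']
      for src in [autoconf_src, automake_src]:
          subprocess.check_call(['./configure', '--prefix=' + internal],
                                cwd=src, env=env)
          subprocess.check_call(['make', 'install'], cwd=src, env=env)
      # Stage 2: with the internal tools on the PATH, rebuild both for
      # the final install prefix. The internal copy is also the one used
      # to bootstrap RTEMS.
      for src in [autoconf_src, automake_src]:
          subprocess.check_call(['make', 'distclean'], cwd=src, env=env)
          subprocess.check_call(['./configure', '--prefix=' + prefix],
                                cwd=src, env=env)
          subprocess.check_call(['make', 'install'], cwd=src, env=env)
      return internal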
Refactor the reporter to allow the setbuilder to use its build configuration
rather than regenerating the configuration from the configuration file.
Using the config file and the build macros exposed an issue where a
macro defined in a build set above the config file was undefined when
the config file was reprocessed. Using the build set's configuration as
it was used for the build is a better solution.
The reporter was refactored to allow a config class to be used
to report.
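A hedged sketch of the shape this takes; the class and method names are
illustrative, not the RSB's reporter API:

  class report:
      def __init__(self, name, opts, macros):
          self.name = name
          self.opts = opts
          self.macros = macros
          self.out = []

      def config(self, config):
          # Report from an already loaded configuration object rather
          # than re-reading the configuration file, so the macros are
          # exactly those the build used.
          for package in config.packages():
              self.out.append(package.name())

      def write(self, path):
          with open(path, 'w') as f:
              f.write('\n'.join(self.out) + '\n')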
The setbuilder can now take a configuration file as an input file.
Added a check in the options post processing to verify that the
prefix path allows writes. No actual write is attempted; only
the permissions are checked. If the --no-install option is
used the check is not made.
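Something along the lines of the following, using os.access so only the
permissions are inspected (the function name is an assumption):

  import os

  def prefix_writable(prefix, no_install):
      # With --no-install the prefix is never written to, so skip
      # the check.
      if no_install:
          return True
      # Walk up to the first existing directory and test its
      # permissions; no file is actually written.
      path = os.path.abspath(prefix)
      while not os.path.isdir(path):
          parent = os.path.dirname(path)
          if parent == path:
              break
          path = parent
      return os.access(path, os.W_OK)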
Moved the --no-install option from the set builder to the options
module.
To support building snapshots and pre-release sources the defaults
have been refactored. The defaults have been moved to a stand alone
file and a macros.py module added. This module abstracts the
old defaults dictionary, turning it into a class. The macros
class can load macros from a file, which is why the defaults have
been moved to a stand alone file.
The use of defaults has been removed from the project. The only
place it remains is in the options, where the defaults are read
from a file. Macros are used everywhere now.
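A cut down sketch of the idea behind the macros class; the 'key: value'
file format shown here is a simplification:

  class macros:
      def __init__(self, name=None):
          # The old defaults dictionary becomes instance state.
          self.macros = {}
          if name is not None:
              self.load(name)

      def load(self, name):
          # Load macros from a stand alone defaults file.
          with open(name) as f:
              for line in f:
                  line = line.strip()
                  if len(line) == 0 or line.startswith('#'):
                      continue
                  key, value = line.split(':', 1)
                  self.macros[key.strip()] = value.strip()

      def __getitem__(self, key):
          return self.macros[key]

      def __setitem__(self, key, value):
          self.macros[key] = value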
defaults.py has been moved into options.py and the separate
options and defaults values have been moved to a new pattern. When
constructing an object that needs macros and options, if the macros
passed in are None the defaults from the options are used. This makes
it clear when the defaults are being used and when a modified set of
macros is being used.
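For example, the constructor pattern looks roughly like this (the class
name is illustrative):

  class build:
      def __init__(self, name, opts, macros=None):
          self.name = name
          self.opts = opts
          # If no macros are passed in, use the defaults held by the
          # options; otherwise use the modified set the caller supplies.
          if macros is None:
              self.macros = opts.defaults
          else:
              self.macros = macros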
The macros class supports maps. The default map is 'global', which is
where all the defaults reside and where configuration file changes end up.
Maps allow macros to be read from a file and to override the values
maintained in the 'global' map. Reading a macro first checks
the selected map and, if the macro is not present there, checks the
'global' map.
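Extending the earlier sketch, the map handling amounts to something like:

  class macros:
      def __init__(self):
          # All defaults and configuration file changes live in the
          # 'global' map; other maps overlay it.
          self.maps = {'global': {}}
          self.map = 'global'

      def set_map(self, name):
          self.maps.setdefault(name, {})
          self.map = name

      def __setitem__(self, key, value):
          self.maps[self.map][key] = value

      def __getitem__(self, key):
          # Check the selected map first and fall back to 'global'.
          if self.map != 'global' and key in self.maps[self.map]:
              return self.maps[self.map][key]
          return self.maps['global'][key]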
The addition of maps to the macros provides the base needed to support
snapshots and pre-release testing with standard configurations.
This functionality still needs to be added. It works by letting you
specify a snapshot with:
source0: none, override, 'my-dist.tar.bz2'
and that value will be used rather than the one from the standard
configuration. With a build set you also need to specify which package
these macros are for. The maps provide this.
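Continuing the sketch above, a per-package override might be used like
this (the map name is hypothetical):

  m = macros()
  m['source0'] = 'the standard configuration value'
  m.set_map('my-package-snapshot')
  m['source0'] = "none, override, 'my-dist.tar.bz2'"
  print(m['source0'])   # the override from the map, not the global value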
Refactor the options handling in defaults.py to allow the --jobs
option to take varying specific parameters. The option supports 'none',
'max' and 'half', a fraction to divide the number of CPUs, or
an integer value which is the number of jobs. The --no-smp option has
been removed.
The host specific modules have been changed to set the number of
CPUs in the defaults table.
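A rough interpretation of the --jobs values; the exact fraction
semantics here are an assumption:

  import multiprocessing

  def jobs(value, ncpus=None):
      if ncpus is None:
          ncpus = multiprocessing.cpu_count()
      if value == 'none':
          return 1               # no parallel jobs
      if value == 'max':
          return ncpus           # one job per CPU
      if value == 'half':
          return max(1, ncpus // 2)
      n = float(value)
      if n < 1.0:
          # Treat a fraction as a share of the available CPUs.
          return max(1, int(ncpus * n))
      return int(n)              # an explicit job count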
Fixed --keep-going to clean up if --always-clean is provided,
even when the build has an error.
Add support to build MinGW tools using Cygwin. This is a Canadian cross
build.
Do not expand the directives when parsing a configuration file. Hold
the text as read from the configuration file in the package object. Still
parse the logic but leave the macros unexpanded. This allows a configuration
to be varied when the build happens. The Canadian cross uses this to build a
build compiler that is then used to build the Cxc runtime.
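A simplified sketch of holding the raw directive text and expanding it
only at build time (names are illustrative and macros is assumed to be a
plain dictionary here):

  import re

  class package:
      def __init__(self):
          # Directive text is stored exactly as read from the
          # configuration file, macros unexpanded.
          self.directives = {}

      def set_directive(self, name, text):
          self.directives[name] = text

      def expand(self, name, macros):
          # Expand %{...} references only when the build happens, so
          # the same parsed configuration can be built with different
          # macro sets, as the Canadian cross build does.
          def _sub(match):
              return macros.get(match.group(1), match.group(0))
          return re.sub(r'%\{([^}]+)\}', _sub, self.directives[name])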
Add Cxc support to the build module. In the defaults add rm and rmfile
macros, and add Cxc paths and pre-build script code.
In the setbuilder check for a Cxc build and, if the package
allows a Cxc build, build the build host version and then the host
target version.
Add Cygwin support to the defaults processing and to the Windows module.
When using the set builder and nesting builds, provide the nested
set builder and build objects with copies of the master defaults.
Python's variable sharing meant a single set of defaults was shared
across all build sets, and this resulted in polluted configurations.
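The underlying Python behaviour, for reference:

  import copy

  master = {'prefix': '/opt/rtems'}

  shared = master                  # both names reference the same dict
  shared['prefix'] = '/tmp/a'      # master['prefix'] changes too

  private = copy.deepcopy(master)  # an independent copy
  private['prefix'] = '/tmp/b'     # master is left unchanged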
Autoconf hard codes paths into itself. This change is a first
pass at providing a clean environment in which automake can build. Being
able to 'make install DESTDIR=xxx' autoconf and then use it to
build automake needs a clean environment. The purpose is to
allow a prefix that needs root without building and packaging
as root.
By default the Source Builder now directly installs in the prefix and
does not create tar files. You need to supply options to create build
set level tar files and/or package level tar files.
A build set can invoke another build set. This allows an 'all'
type build set that builds all the RTEMS archs.
Change the get config call to return a map of paths and files.
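A sketch of the returned structure; the function name and keys are
assumptions:

  import glob
  import os

  def get_configs(paths):
      # Return a map of the configuration search paths and the
      # configuration files found under them.
      configs = {'paths': list(paths), 'files': []}
      for p in paths:
          for cfg in glob.glob(os.path.join(p, '*.cfg')):
              configs['files'].append(os.path.basename(cfg))
      return configs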