2.2. Administrative Classes

2.2.1.  Host

The class Host manages the entire image processing in the ML, including paging, caching, parallelization, and the calls to the Module::calculate* functionality. It also provides functions for checking and updating module graphs, as well as for calculating (sub)images via getTile() calls.

The Host processes a set of ML modules that are derived from the class Module and connected as a directed acyclic graph.

[Note]Note

Do not use the Host directly through its methods or functions. All of its important functionality is wrapped as static functions in Module. The Host should remain a "hidden" part of the ML so that Host replacements and improvements do not endanger module compatibility.

For memory-optimized calculation of an image or subimage, each ML module (derived from Module) supports so-called page-based image processing, i.e., each image is divided into blocks (called pages) which are then processed sequentially or in parallel and are finally combined into the requested (sub)image.

Consequently, the memory usually does not hold the complete image, but only the currently used fragments.

The page extent is defined as an image property of each ML module output. Page-based image processing can degenerate to global image processing when the page extent is set to the extent of the actual image. This, however, is the worst case and should be avoided.

During image processing, the ML stores as many pages as possible in the cache of the MLMemoryManager (Section 1.3.2.2, “The MLMemoryManager and Memory Handling”) to reach maximum performance. Repeated (sub)image requests can often be processed more efficiently by simply reusing existing pages. The extent of pages can be controlled by the application or by the user.

Overview of the image requests performed by the Host in an example pipeline consisting of a Load, a Filter, and a Viewer module:

1. Viewer shows image properties and data from Filter.

2. Filter calculates results from image properties and data from Load.

The Host remains invisible to the modules but processes all Viewer requests. Its functions (e.g., getTile()) are wrapped in the class Module:

Figure 2.1. Requesting an Image Pipeline with getTile()

When a tile is requested, the tile request is broken down into page requests:

Figure 2.2. Page-Based getTile() (I)

The Host calculates a tile with getTile() as follows (a conceptual sketch follows Figure 2.3 below):

  • Allocate memory for the tile.

  • Calculate the pages that are intersected by the tile.

  • For all pages:

    • Already in cache? Yes => Done.

    • Not in cache? => Allocate page in cache.

    • Request to Filter with calculateInputSubImageBox(): Determine input area needed to calculate the page.

    • Allocate and calculate the input tiles by recursively calling getTile().

    • Call calculateOutputSubImage() in Filter to calculate the page.

    • Copy the page (or the part of it that intersects the tile) to the tile.

Figure 2.3. Page-Based getTile() (II)
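
The control flow above can be mimicked by a small, self-contained C++ sketch. It is purely conceptual: the names Page, PageSize, pageCache, calculatePage(), and this simplified getTile() are invented for illustration and are not part of the ML API; the sketch works on a one-dimensional "image" and assumes 0 <= begin <= end.

    #include <map>
    #include <vector>

    typedef std::vector<double> Page;          // one page of voxel values
    static const int PageSize = 64;            // page extent (1D for brevity)

    static std::map<int, Page> pageCache;      // page index -> cached page

    // Stand-in for the page computation performed by calculateOutputSubImage().
    static Page calculatePage(int pageIndex)
    {
      Page page(PageSize);
      for (int i = 0; i < PageSize; ++i) {
        page[i] = static_cast<double>(pageIndex * PageSize + i);   // dummy data
      }
      return page;
    }

    // Stand-in for the Host's getTile(): assemble the voxel range [begin, end)
    // from pages, reusing cached pages where possible.
    static std::vector<double> getTile(int begin, int end)
    {
      std::vector<double> tile(end - begin);                   // allocate the tile

      for (int p = begin / PageSize; p * PageSize < end; ++p)  // intersected pages
      {
        if (pageCache.find(p) == pageCache.end()) {            // not in cache?
          pageCache[p] = calculatePage(p);                     // allocate + compute
        }
        const Page& page = pageCache[p];

        // Copy the intersection of this page with the requested tile.
        for (int i = 0; i < PageSize; ++i) {
          const int pos = p * PageSize + i;
          if (pos >= begin && pos < end) {
            tile[pos - begin] = page[i];
          }
        }
      }
      return tile;
    }

In the real ML, computing a page additionally involves a call to calculateInputSubImageBox() to determine the required input regions and recursive getTile() calls for the corresponding input tiles before calculateOutputSubImage() is invoked, as listed above.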

2.2.2.  Memory

The ML class Memory provides functions for memory allocation, reallocation, freeing, etc. Currently, only basic functionality is available; however, future versions will use automatic strategies to (re)organize and/or clean up memory (and the ML cache) to reduce or avoid out-of-memory errors.

[Important]Important

Whenever possible, use the memory handling functionality of this class when you need to allocate memory yourself.

This class can automatically handle memory errors and will support correct and safe memory handling in the future.

[Note]Note

You can use the mlAPI functionality instead of the Memory class; the mlAPI functions themselves use the Memory class internally.

The following class functionality is currently available (a short usage sketch follows the list):

  1. static void* allocateMemory(MLuint size, MLMemoryErrorHandling handleFailure);

    Allocates a memory block of size bytes.

  2. static void* reallocateMemory(void* ptr, MLuint size, MLMemoryErrorHandling handleFailure);

    The memory block pointed to by ptr is resized and copied so that it has at least size bytes.

  3. static void freeMemory(void* ptr);

    Frees memory that has been allocated with any Memory function. NULL pointers may be passed safely; they are simply ignored.

  4. static void* duplicateMemory(const void *ptr, MLuint size, MLMemoryErrorHandling handleFailure);

    Copies size bytes from the memory pointed to by ptr into a newly allocated buffer, which must be freed by the caller with freeMemory(). If ptr is passed as NULL, NULL is returned without any error handling.

  5. static char* duplicateString(const char *ptr, MLMemoryErrorHandling handleFailure);

    Copies the passed null-terminated string ptr into a newly allocated buffer, which must be freed by the caller with freeMemory().
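
As an illustration, these functions could be combined as in the following sketch. It assumes the ML headers and namespace are available; the variable names are arbitrary, and it uses the error mode ML_RETURN_NULL (described below), so failed allocations must be checked explicitly.

    // Sketch only: allocate, grow, duplicate, and release memory via the Memory class.
    MLuint numBytes = 1024;
    void*  buffer   = Memory::allocateMemory(numBytes, ML_RETURN_NULL);
    if (buffer == NULL) {
      // Allocation failed; handle the error here.
    } else {
      // Grow the buffer to twice its size; the contents are preserved.
      // If reallocation fails, the original buffer is assumed to remain valid.
      void* resized = Memory::reallocateMemory(buffer, 2 * numBytes, ML_RETURN_NULL);
      if (resized != NULL) { buffer = resized; }

      // Duplicate a string into a buffer owned by the caller.
      char* copy = Memory::duplicateString("Hello ML", ML_RETURN_NULL);

      // All buffers must be released with freeMemory(); NULL pointers are ignored safely.
      Memory::freeMemory(copy);
      Memory::freeMemory(buffer);
    }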

[Note]Note

Always use functions of the class Memory in ML contexts so that the ML can optimize memory usage and provide safer memory allocations.

The parameter handleFailure determines the function behavior in error cases:

  1. ML_RETURN_NULL

    If memory allocation fails, NULL is returned without error handling. The programmer must take care of the error.

  2. ML_FATAL_MEMORY_ERROR

    If memory allocation fails, ML_PRINT_FATAL_ERROR() is called with the error code ML_NO_MEMORY; if ML_PRINT_FATAL_ERROR() returns, NULL is returned. The programmer does not need to take care of the error case, because the ML handles it.

  3. ML_THROW_NO_MEMORY

    If memory allocation fails, the error code ML_NO_MEMORY is thrown as an exception. The programmer could implement something like the following:

    Example 2.4. Using Exceptions when Allocating Memory with ML_THROW_NO_MEMORY

    ML_START_NAMESPACE

    // The try/catch block must be placed inside a function;
    // the function name used here is arbitrary.
    void allocateWithExceptions()
    {
      try
      {
        // Try to allocate...
        void* data = Memory::allocateMemory(1000, ML_THROW_NO_MEMORY);

        // ... use the memory ...

        Memory::freeMemory(data);
      }
      catch (MLErrorCode)
      {
        // Handle error if memory could not be allocated.
      }
    }

    ML_END_NAMESPACE

Note that these error handling cases will only occur if the Memory class functionality has no chance to allocate the required memory. In future versions, the following might happen: The first internal allocation fails, but the Memory class clears the ML cache and successfully retries memory allocation. In those cases, none of the above error cases will be used.

2.2.3.  Base

The ml::Base class is a base class of many ML classes and is designed for all objects that are passed between ML modules via so-called Base fields. It thus makes it possible to transfer arbitrary data types between modules.

2.2.4. The Runtime Type System

The ML provides a so-called Runtime Type System for managing all important classes available in the module database and in the ML.

  • Runtime

    This class contains the global runtime type system of the ML. It manages a dictionary of runtime types and can create and remove runtime types. This class only contains static components and must be initialized with init() and destroyed with destroy().

  • RuntimeType

    This class contains runtime-generated type and inheritance information of associated classes. To track this information, the macros defined in mlRuntimeSubClass.h have to be inserted in the declaration and implementation of the associated classes.

  • RuntimeDict

    This class manages a set of instances of the class RuntimeType. The class Runtime uses one global instance of this class for the runtime type system of the ML.

The file mlRuntimeSubClass.h also includes important and frequently used macros.

  • ML_BASE_IS_A(base,type)

    This macro is used to check whether the given Base pointer is of the expected type: ML_BASE_IS_A(base, MarkerExample).

[Note]Note

The macro ML_BASE_IS_A should be replaced by the explicit cast mlbase_cast<BaseType*>(object), which has been available in the ML since MeVisLab 2.0.
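
For example, a Base pointer received from a Base field could be inspected as follows. This is only a sketch: the function inspectBaseObject() is invented for illustration, MarkerExample stands for any Base-derived class, and mlbase_cast is assumed to return NULL when the pointed-to object is not of the requested type.

    // Sketch: checking the type of a Base pointer.
    void inspectBaseObject(Base* basePointer)
    {
      // Old style: check the type with the macro, then cast manually.
      if (ML_BASE_IS_A(basePointer, MarkerExample)) {
        MarkerExample* markers = static_cast<MarkerExample*>(basePointer);
        // ... use markers ...
      }

      // Preferred since MeVisLab 2.0: mlbase_cast checks and casts in one step
      // and is assumed to return NULL for incompatible types.
      if (MarkerExample* markers = mlbase_cast<MarkerExample*>(basePointer)) {
        // ... use markers ...
      }
    }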

One of the following macros must be used in the header file of a class derived from Base.

Each of these macros declares the interface to the runtime type system of the derived class; a combined header/source example is given after Example 2.5.

  • ML_CLASS_HEADER(className)

    This macro must be included in the header of a non-abstract class to declare some additional methods described below.

  • ML_MODULE_CLASS_HEADER(ModuleClassName)

    This macro must be included in the header of a class derived from the class Module to declare some additional methods described below.

  • ML_ABSTRACT_CLASS_HEADER(className)

    This macro must be included in the header of an abstract class to declare some additional methods described below.

One of the following macros must be used in the source file of a class derived from Base.

Each of these macros implements the interface to the runtime type system of the derived class.

  • ML_CLASS_SOURCE(className, parentName)

    This macro must be included in the source file of a non-abstract class to implement the methods declared with ML_CLASS_HEADER.

  • ML_MODULE_CLASS_SOURCE(className, parentModule)

    This macro must be included in the source file of classes derived from the class Module to implement the methods declared with ML_MODULE_CLASS_HEADER. Module implements protected constructors and assignment operators to prevent Module objects from being copied or assigned; therefore, the normal ML_CLASS_SOURCE macro cannot be used.

  • ML_ABSTRACT_CLASS_SOURCE(className,parentName)

    This macro must be included in the source file of an abstract class to implement the methods declared with ML_ABSTRACT_CLASS_HEADER.

  • ML_MODULE_CLASS_SOURCE_EXT(className, parentModule, superClassConstructs)

    This macro is an alternative to ML_MODULE_CLASS_SOURCE if the constructor of the parentModule does not have two parameters or if other members need to be initialized (e.g., constants). The third parameter superClassConstructs permits the specification of the correct superclass constructor call; e.g., ML_MODULE_CLASS_SOURCE_EXT(MyFilter, MyParentModule, :MyParentModule()) calls the superclass constructor without the two parameters that would be passed for a normal Module.

    If you need to pass a more complex expression as the third parameter, such as a comma-separated list of superclass and member initializers, use the following trick:

    Example 2.5. How to Use the Macro ML_MODULE_CLASS_SOURCE_EXT

    // Stuff to do for base classes when copy constructor is implemented
    // (which is done in a macro to have a private and not an executable
    // copy constructor).
    
    #define _INIT_STUFF : Module(0,0), _initMember1(initValue1), \
                                       _initMember2(initValue2)
    
    
    // This macro declares some automatically generated functions and methods
    // for the runtime system and for the initialization of this class. It
    // implements more elaborated superclass and member initializers given
    // by _INIT_STUFF.
    
    ML_MODULE_CLASS_SOURCE_EXT(MyNewModule, Module, _INIT_STUFF)
    
    #undef _INIT_STUFF
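
For completeness, a minimal non-abstract Base-derived class combines the header and source macros roughly as in the following sketch. The class MyBaseObject and its member are invented for illustration, and the exact placement of the macro inside the class declaration may differ in real ML code.

    // MyBaseObject.h -- a minimal, hypothetical Base-derived class.
    class MyBaseObject : public Base
    {
    public:
      MyBaseObject() : _value(0) {}

      int  getValue() const { return _value; }
      void setValue(int v)  { _value = v; }

    private:
      int _value;

      // Declares the runtime type interface (see the methods described below).
      ML_CLASS_HEADER(MyBaseObject)
    };

    // MyBaseObject.cpp -- implements the methods declared by ML_CLASS_HEADER;
    // the second argument names the parent class.
    ML_CLASS_SOURCE(MyBaseObject, Base)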


See also the file mlLibraryInitMacros.h, which does not directly belong to the runtime type system but contains macros for initialization after the runtime linking of a library. These macros permit the implementation of a function in a library in which module classes and runtime types can be initialized directly after the library has been linked.
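
A typical initialization file looks roughly like the following sketch. It is hedged: the function name MyLibraryInit and the registered class MyBaseObject are placeholders, and initClass() is assumed to be the type registration method provided by the class macros above; details may differ between ML versions.

    // mlMyLibraryInit.cpp -- hypothetical library initialization file.
    ML_START_NAMESPACE

    int MyLibraryInit()
    {
      // Register the runtime types of the classes exported by this library.
      MyBaseObject::initClass();
      return 1;
    }

    ML_END_NAMESPACE

    // Ensures that MyLibraryInit() is called directly after the library has
    // been linked at runtime (macro from mlLibraryInitMacros.h).
    ML_INIT_LIBRARY(MyLibraryInit)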

2.2.5. Debugging and Error Handling Support

See Chapter 5, Debugging and Error Handling for detailed information on concept, classes and macros for error handling and debugging.