
 Data Management

As organizations grow, the level of heterogeneity across their systems increases. This is true whether they grow internally or through acquisition. The ability to share or unify information decreases, and the effort required to make decisions from comprehensive information skyrockets. Organizations settle for doing what they can. Over the years a number of companies have developed ETL (Extract, Transform, Load) tools in an attempt to resolve some of these "Islands of Information" issues. Through the acquisition and deployment of these tools, organizations have discovered that there is a large family of issues arising from their heterogeneous set of systems, databases, data models, requirements, and collection facilities. This family includes:

  • The same data types on different platforms often differ, frequently in several ways,
  • Different data types do not always have uniform mappings,
  • Related attributes in different systems may often vary in their actual semantics,
  • Attributes of identical intended semantics on different systems often vary in their actual semantics,
  • Data quality management varies across systems (in focus, definition, mechanisms, and results),
  • Unifying Data Quality metrics and remedial measures has a family tree of its own,
  • Moving very large volumes of data results in very large issues of time, processing capacity, resource allocation, parallelism, fault tolerance (check pointing), and scalability,
  • What ETL operations are performed on what platforms is more important than it should be,
  • These ETL operations require more time, human resources, expertise, and maintenance than they should,
  • Changes on any one system are frequent and disruptive,
  • Much of the intelligence embedded in this work is, in effect, organizational meta-data,
  • This 'derived' meta-data should be brought to bear on the problem,
  • All of this is more than humans should be expected to address effectively.
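
To make the first few issues above concrete, here is a minimal sketch, with entirely hypothetical record layouts, of how the "same" attribute on two systems can differ in name, type, and semantics, and so require explicit mapping onto a shared target model:

```python
from datetime import datetime, timezone

# Hypothetical records for the "same" customer entity on two systems.
# System A stores dates as 'YYYYMMDD' strings; System B stores epoch
# seconds in UTC. Names, types, and semantics all differ.
record_a = {"CUST_NM": "ACME CORP", "OPEN_DT": "20050117"}
record_b = {"customer": "Acme Corp.", "opened": 1105920000}

def unify_a(rec):
    """Map System A's conventions onto a shared target model."""
    return {
        "customer_name": rec["CUST_NM"].title(),
        "opened_utc": datetime.strptime(rec["OPEN_DT"], "%Y%m%d")
                              .replace(tzinfo=timezone.utc),
    }

def unify_b(rec):
    """Map System B's conventions onto the same target model."""
    return {
        "customer_name": rec["customer"].rstrip("."),
        "opened_utc": datetime.fromtimestamp(rec["opened"], tz=timezone.utc),
    }

ua, ub = unify_a(record_a), unify_b(record_b)
# Even after mapping, a matching rule is still needed to decide that
# these are the same entity -- raw string equality was not enough.
print(ua["customer_name"], ub["customer_name"])  # prints: Acme Corp Acme Corp
```

Multiply this one mapping by every attribute, table, and system pair involved, and the scale of the problem becomes clear.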

We take a "collective problem solving" approach to data management. We achieve this through Ab Initio's (ab initio, Latin for "from the beginning") collection of software products. The Ab Initio Co-Operating System provides a layer over the operating system (MVS, OS/390, UNIX, Windows 2000/NT) that renders all platforms homogeneous from a runtime perspective. The Ab Initio Graphical Development Environment (GDE) provides a high-level graphical interface for designing and defining the entire dataflow: its source(s), operation(s), target(s), and their connections. The GDE enables data engineers to focus on the conceptual model while the underlying software addresses the low-level mechanics.

Physical data objects serving as sources and targets can be virtually anything: flat files, ISAM files, databases (relational, network, or hierarchical), SAS data objects, object stores, Excel files, and more. A large and comprehensive set of 'out-of-the-box' operations is provided, and any operation one wishes to build can be built and used as an operation, regardless of the language it is written in: a Co-Operating System wrapper can be applied to the operation so that it can run "anywhere". Since the GDE is "data aware" (for all sources and targets), any change to any attribute in any table of any source or target is automatically registered by the GDE in its internal data management log, and the appropriate runtime component(s) can be generated. The GDE is also heavily parameter driven; these parameters provide all of the configuration information.
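
Ab Initio's actual interfaces are proprietary, so the following is a conceptual sketch only (all names hypothetical) of the dataflow idea described above: sources, operations, and targets wired into a chain, where any ordinary function can be "wrapped" as an operation:

```python
from typing import Callable, Iterable, Iterator

# Conceptual sketch only -- not Ab Initio's API. A dataflow is a chain
# of sources, operations, and targets over streams of records.
Record = dict

def source(records: Iterable[Record]) -> Iterator[Record]:
    """A source yields records; real sources could be files, tables, ..."""
    yield from records

def op_filter(pred: Callable[[Record], bool]):
    """Wrap any predicate as a filtering operation."""
    def run(stream):
        return (r for r in stream if pred(r))
    return run

def op_transform(fn: Callable[[Record], Record]):
    """Wrap any record-to-record function as a transform operation."""
    def run(stream):
        return (fn(r) for r in stream)
    return run

def target(stream) -> list:
    """A target materializes the stream; here, just an in-memory list."""
    return list(stream)

# Wire Source -> Operations -> Target.
data = source([{"qty": 3}, {"qty": 0}, {"qty": 7}])
flow = op_transform(lambda r: {**r, "qty2": r["qty"] * 2})(
           op_filter(lambda r: r["qty"] > 0)(data))
result = target(flow)
print(result)  # [{'qty': 3, 'qty2': 6}, {'qty': 7, 'qty2': 14}]
```

Because each operation is just a function over a record stream, the same pattern accommodates code written elsewhere and merely wrapped, which is the essence of the "build anything, run anywhere" property described above.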

The runtime products of the GDE can be run on any platform running the Co-Operating System. They can be instructed to run in parallel, on specific CPUs, or under specific assignment criteria (load balancing, round-robin, ...). The executable dataflow objects are fully scalable. Data movement, transformations, defect pooling, defect reporting, and accepted-data capture are all supported through the operations.
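
As a hedged illustration of two of the runtime ideas above, again using hypothetical names rather than Ab Initio's API, the sketch below round-robins records into partitions, processes the partitions in parallel, and diverts records that fail validation into a defect pool instead of the accepted-data target:

```python
from concurrent.futures import ThreadPoolExecutor

# Conceptual sketch -- not Ab Initio's API. Round-robin partitioning,
# parallel per-partition processing, and defect pooling: records that
# fail validation go to a reject pool rather than the target.
def round_robin(records, n):
    parts = [[] for _ in range(n)]
    for i, rec in enumerate(records):
        parts[i % n].append(rec)
    return parts

def process_partition(part):
    accepted, rejected = [], []
    for rec in part:
        if isinstance(rec.get("amount"), (int, float)):
            accepted.append({**rec, "amount": float(rec["amount"])})
        else:
            rejected.append({**rec, "defect": "non-numeric amount"})
    return accepted, rejected

records = [{"id": 1, "amount": 10},
           {"id": 2, "amount": "?"},
           {"id": 3, "amount": 4.5}]

with ThreadPoolExecutor(max_workers=2) as pool:
    results = list(pool.map(process_partition, round_robin(records, 2)))

accepted = [r for acc, _ in results for r in acc]
defects = [r for _, rej in results for r in rej]
```

Keeping defects pooled with a recorded reason, rather than silently dropping them, is what makes downstream defect reporting possible.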

The system-wide meta-data is automatically generated and kept current through a common meta-data repository that is populated and used by many of the Ab Initio software components. The interface provides an efficient method for change impact analysis, both a priori and a posteriori.
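
Change impact analysis over such a repository amounts to reachability in a dependency graph: given an attribute that is about to change (a priori) or has changed (a posteriori), find everything derived from it. A minimal sketch, with hypothetical attribute and artifact names:

```python
# Conceptual sketch: impact analysis as reachability over a meta-data
# dependency graph. Edges point from an attribute or artifact to the
# downstream artifacts derived from it (all names hypothetical).
depends_on = {
    "orders.amount": ["revenue_feed", "daily_summary"],
    "revenue_feed":  ["finance_mart"],
    "daily_summary": [],
    "finance_mart":  [],
}

def impact(node, graph):
    """Return every artifact downstream of `node`."""
    seen, frontier = set(), [node]
    while frontier:
        cur = frontier.pop()
        for nxt in graph.get(cur, []):
            if nxt not in seen:
                seen.add(nxt)
                frontier.append(nxt)
    return seen

print(sorted(impact("orders.amount", depends_on)))
# ['daily_summary', 'finance_mart', 'revenue_feed']
```

The value of a shared repository is that this graph is maintained automatically as dataflows are built, rather than reconstructed by hand each time a change is proposed.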

We can resolve the "Islands of Information" for virtually any organization.






Solutions Engineering Corporation
5149 Westbard Avenue
Bethesda, MD 20816
© 2005 Solutions Engineering Corporation
Voice (240) 432-3798
Fax (202) 330-5753