Automated Architecture Checking of UML Based SoC Specifications
By R. Deaves, S. Eaton-Rosen*, M. Habets, S. Ryan, L. Rattin and A. Jones
STMicroelectronics
* Summer Intern Student from Bath University, United Kingdom
Abstract:
This paper documents work carried out to automate the checking of system software architecture specifications of SoCs. Checking the architecture manually can be time consuming and prone to error, especially for architectures that are large and complex. To achieve automation, the architecture specification methodology is first represented as UML meta-models to highlight associations that can be checked. These are then analyzed to determine the checks to be carried out. The analysis indicated that checks relating to construction, completeness, consistency and connectivity could be automated.
The checks were realized and demonstrated in an architecture CASE tool, where they detected artificially introduced omissions and inconsistencies in a realistic architecture.
1. Introduction
Consumer electronic SoC devices are becoming larger and more complex. This is driven by (i) the consumer requirement for more features, (ii) box manufacturers' pursuit of lower cost through integration, and (iii) smaller silicon feature sizes that provide a higher density of transistors.
This increase in complexity percolates upwards to the domain of system architecture, where a number of methodologies have been developed to deal with it, including the use of more formal specification methodologies.
However, as the size and complexity of architectures increase, providing a consistent and complete architecture becomes difficult to achieve manually. This is especially true when developments, enhancements and changes are introduced late in the architecture process.
One way to alleviate this problem is to introduce automatic checking of the architecture specification. This is achieved by having the architecture CASE tool check the specification and highlight any breakdowns. This is the subject matter of this paper.
The remainder of the paper is organized as follows: Section 2 provides the motivation for the work. Section 3 gives a literature review of the work of others on architecture checking. The framework for the investigation is detailed in Section 4. The paper continues by providing the meta-models for the formal software architecture specification in Section 5. Analysis of the meta-models identifies the most pertinent associations to be checked in Section 6. Section 7 provides details of the realization of the checks. Concluding remarks on this work are provided in Section 8.
2. Motivation
ST has been developing a formal methodology to specify the system software architecture of its SoCs. This work included the application of different 'Views' to partition the specification, and UML to realize those views [1]. In addition, the application of our formal methodology has been demonstrated and documented [2], along with analysis of the architecture patterns generated [3].
Fig 1: Previous v Current Device
A recent comparison of our previous and current device specifications is provided in Fig 1.
Here the number of models used in specifying the system software architecture is summarized. The results indicate that the complexity of the devices architected is increasing, as shown by the number of sub-systems integrated (2 in the previous device and 5 in the current product). Further, the numbers of Use Case, Component and Deployment models have all increased. This leads to increased effort if the specification is checked manually.
The work presented in this paper addresses this issue by automating the checking of our system software architecture specification.
3. Background
This section reviews previous work carried out in the area of architecture checking.
In [4], a graphical representation of architecture constraints is defined to provide a formal framework. This is developed to provide a semantic representation in predicate logic that can be checked by a simple algorithm to highlight consistency violations. The framework is validated on a test scenario based on a train control system. An important aspect of this work is to apply consistency checking not only within a particular view (intra-view) but also between different views (inter-view).
The work presented in [5] uses meta-models to define UML models that can be checked. The paper also describes a tool, the Model Consistency Checker (MCC), that supports this capability.
The excellent book by Hofmeister et al. [6] defines a software architecture methodology which has been used as the basis of the ST formal System Software Architecture specification framework [1], [2], [3]. Here, the views defined in [6] are formally represented by meta-models. This clarifies what checks can be carried out.
4. Investigation Framework
The investigation presented in this paper uses many aspects of the work reviewed. These are summarized below:
- Meta-models of the ST formal System Software Architecture methodology will be developed.
- The meta-models will be analyzed to highlight potential checks that work on intra- and inter-view representations.
- The checks identified will be realized and demonstrated in our current UML modeling tool [7].
5. Methodology Meta-Models
This section of the paper provides the meta-models for the formal methodology adopted for system software architecture specification in ST.
5.1 Top-Level Meta-Model
Fig 2 represents the top-level meta-model. This comprises five different views.
Fig 2: Top-Level Meta-Model
Fig 3: Requirements View Meta-Model
In summary, the Requirements View documents the use cases associated with the specification. Each use case is decomposed into a component model, comprising a number of individual components, in the Conceptual View. These components do not partition functionality between hardware and software. The Hardware View provides a representation of the SoC IPs and interconnects. The Implementation View takes the Conceptual View and Hardware View to provide a realization of the specification; this is the view that defines what software is required and on what CPU it executes. The Delivery View summarizes the software components and provides estimates of CPU loading, memory bandwidth/size and concurrent (stressing) use case performance.
5.2 Requirements View Meta-Model
The meta-model for the Requirements View is represented in Fig 3.
The Requirements View identifies the Boundary of the System. Generally, this consists of the Box in which the SoC will be included. In addition, the Actors of the System are identified. The Use Cases of the System are grouped into related areas. Each Group is represented by Actors and Use Case Diagrams. The Use Cases may be Associated by relationships including (but not exclusively) uses, extend and include.
5.3 Conceptual View Meta-Model
The meta-model for the Conceptual View is represented in Fig 4.
The Conceptual View takes the Use Cases and represents them as Components, Ports and Connectors. At this point the HW/SW partitioning has not been identified. Dynamic Models are used to enhance the description. This view is used as input to the Implementation View.
Fig 4: Conceptual View Meta-Model
5.4 Hardware View Meta-Model
The meta-model for the Hardware View is represented in Fig 5.
The Hardware View identifies all the IPs (represented as Deployment Nodes) required to realize the System. These are determined through discussion between the System Software and Hardware Architects. These include IPs that are Pure HW, Firmware Driven HW and OS CPU Driven HW. This is used as input to the Implementation View.
Fig 5: Hardware View Meta-Model
5.5 Implementation View Meta-Model
The meta-model for the Implementation View is represented in Fig 6.
Fig 6: Implementation View Meta-Model
The Implementation View takes the Hardware View and Conceptual View to provide the Deployment Models, i.e. what Software Modules execute on what CPU.
5.6 Delivery View Meta-Model
The meta-model for the Delivery View is represented in Fig 7.
Fig 7: Delivery View Meta-Model
The Delivery View summarizes the Software Modules associated with each CPU and the Performance Estimates for identified concurrent use cases.
This provides a number of useful system-level metrics, including:
- Identification of all the software modules required, along with their status. This is used by Project Management for cost and planning, and by the detailed architecture/design teams as boundaries for their work.
- Performance metrics for the Marketing teams to ensure that their requirements can be met.
5.7 Inter-View Meta-Model
To complete the meta-model, a representation of the inter-view associations is provided in Fig 8.
Fig 8: Inter-View Meta-Model
In this representation each View is represented vertically, i.e. as a column. Each meta-class in that view that has an association is included on the diagram.
6. Meta-Model Analysis
In this section of the paper the system specification meta-models are analyzed to determine what checks can be carried out.
It should be noted that the nature or role of the associations provided in the meta-models of the previous section has been restricted to one of the set {Describes, Uses, Associates, Hierarchy}. This is a deliberately limited set, providing generic rather than specific roles, chosen to aid the meta-model analysis in this paper. As an example of a specific role, consider the Inter-View Meta-Model of Fig 8: the association between the Use Case Diagram and the Component Model could be stated as Generates, as each Use Case Diagram generates an individual Component Model.
Using the restricted set of roles and the UML composition notation, associations highlighting a variety of checks can be identified. Fig 9 provides a representation of checks that, if automated, would make generating and maintaining the architecture specification less time consuming and less prone to error.
Fig 9: Architecture Specification Checking
The main check areas identified are listed below:
- Construction: This check ensures that all the sections required for the specification are included and appropriately represented. A simple example would be to check that all the identified Views are present.
- Completeness: This check aims to determine whether any parts of the architecture specification have been omitted. A simple example would be to ensure that descriptive entries have been made.
- Connectivity: Here the connectivity of the models is checked. For example, if a component model has a port which is not used, this should be highlighted.
- Consistency: This check ensures that the specification is internally consistent. For example, it ensures that each use case model identified has a corresponding component model; a script sketch of this check is given after this list.
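To make the intent of these checks concrete, the following is a minimal sketch of the consistency example, expressed against the Enterprise Architect scripting interface described in Section 7.1. The view package names and the convention that a component shares its use case's name are assumptions made for illustration, not part of the methodology itself.

    // Consistency sketch: every Use Case element in the Requirements View
    // should be matched, by name, to a Component element in the Conceptual
    // View. Name matching is an illustrative convention only.
    function collectNames(pkg, type, names) {
        for (var i = 0; i < pkg.Elements.Count; i++) {
            var e = pkg.Elements.GetAt(i);
            if (e.Type == type) names[e.Name] = true;     // record element name
        }
        for (var j = 0; j < pkg.Packages.Count; j++) {    // recurse into sub-packages
            collectNames(pkg.Packages.GetAt(j), type, names);
        }
    }

    function findView(name) {                             // locate a view package by name
        var root = Repository.Models.GetAt(0);
        for (var i = 0; i < root.Packages.Count; i++) {
            if (root.Packages.GetAt(i).Name == name) return root.Packages.GetAt(i);
        }
        return null;
    }

    var req = findView("Requirements View");
    var con = findView("Conceptual View");
    if (req == null || con == null) {
        Session.Output("Consistency check skipped: view package not found");
    } else {
        var useCases = {}, components = {};
        collectNames(req, "UseCase", useCases);
        collectNames(con, "Component", components);
        for (var name in useCases) {
            if (!(name in components))
                Session.Output("Consistency ERROR: no component model for use case '" + name + "'");
        }
    }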
7. Checking Realization
This section of the paper documents how the checks identified are realized in an architecture CASE tool. For brevity, only the Construction and Connectivity checks are documented as examples.
7.1 Enterprise Architect
The tool used for this work is Enterprise Architect (EA) [7]. This is a powerful architecture tool providing UML support (along with other model types), automatic documentation generation (RTF and HTML) and integration with other tools (e.g. IBM DOORS). Further, and pertinent to the work presented here, EA also provides mechanisms for checking the architecture. Primarily, there are two methods: (i) the use of SQL to query the architecture representation, and (ii) provision for executing scripts. These capabilities provide a powerful paradigm for testing the architecture specification developed. For the investigation documented in this paper, JavaScript scripts are used to realize the checks; a short sketch of the SQL query mechanism is also given below.
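As an illustration of the SQL mechanism, the following fragment issues a query against EA's repository schema from a script and echoes the raw XML result to the output window. It corresponds to the completeness example of Section 6 (detecting missing descriptive entries) and assumes the t_object table and its Note column, which hold element names and descriptions in EA's schema.

    // Completeness sketch: list elements whose Notes (description) field is
    // empty. Repository.SQLQuery returns the result set as an XML string.
    var sql = "SELECT Name, Object_Type FROM t_object " +
              "WHERE Note IS NULL OR Note = ''";
    Session.Output(Repository.SQLQuery(sql));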
7.2 Construction Check
Fig 10 represents the Project Browser of EA defining the system architecture for a Gateway SoC.
Fig 10: Gateway Architecture Specification
In order to exercise the check, the Requirements and Delivery View names are changed to mimic their omission. The updated project view is represented in Fig 11.
Fig 11: Modified Gateway Architecture
The check is carried out by executing the check script through a window in EA. This is represented in Fig 12.
Fig 12: Scripting Window
The output of running the script is provided in an EA window, see Fig 13.
Fig 13: Script Output
This demonstrates that the script has successfully detected a problem with the construction of the specification. The essential logic of such a script is sketched below.
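The following is a minimal sketch of a construction check, assuming the five view packages of Section 5 sit directly under the first root model of the project; the script used in the demonstration may organize this differently.

    // Construction sketch: verify that the root model contains a package for
    // each of the five views of the methodology (Section 5).
    var EXPECTED_VIEWS = ["Requirements View", "Conceptual View", "Hardware View",
                          "Implementation View", "Delivery View"];

    var root = Repository.Models.GetAt(0);                // first root node of the project
    for (var i = 0; i < EXPECTED_VIEWS.length; i++) {
        var found = false;
        for (var j = 0; j < root.Packages.Count; j++) {
            if (root.Packages.GetAt(j).Name == EXPECTED_VIEWS[i]) { found = true; break; }
        }
        if (!found)
            Session.Output("Construction ERROR: view missing - " + EXPECTED_VIEWS[i]);
    }

Run against the modified project of Fig 11, a script of this form would report the renamed Requirements and Delivery Views as missing, consistent with the output of Fig 13.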
7.3 Connectivity Check
The Connectivity Check documented here is concerned with testing a component model, see Fig 14. Here a note highlights a link that will be removed in order to exercise the connectivity check. To execute the test, the association (<<if16>>) was removed and the script run. The output from the script is provided in Fig 15. This demonstrates that the automated checking has detected a port which is unconnected and needs to be investigated. A sketch of such a port check is given after the figures below.
Fig 14: Component Model
Fig 15: Unused Port Detection
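The following is a minimal sketch of the core of such a port check: it walks the component models of a view package and reports any Port with no attached connector. That ports appear as EA embedded elements of their owning components, and that the component models live in the Conceptual View, are assumptions of this sketch.

    // Connectivity sketch: report any Port that has no connector attached.
    function checkPorts(pkg) {
        for (var i = 0; i < pkg.Elements.Count; i++) {
            var element = pkg.Elements.GetAt(i);
            // In EA, ports are embedded elements of their owning component.
            for (var j = 0; j < element.EmbeddedElements.Count; j++) {
                var port = element.EmbeddedElements.GetAt(j);
                if (port.Type == "Port" && port.Connectors.Count == 0)
                    Session.Output("Connectivity WARNING: unconnected port '" + port.Name +
                                   "' on component '" + element.Name + "'");
            }
        }
        for (var k = 0; k < pkg.Packages.Count; k++)      // recurse into sub-packages
            checkPorts(pkg.Packages.GetAt(k));
    }

    var root = Repository.Models.GetAt(0);
    for (var p = 0; p < root.Packages.Count; p++) {
        if (root.Packages.GetAt(p).Name == "Conceptual View")
            checkPorts(root.Packages.GetAt(p));
    }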
8. Concluding Remarks
The capability of the architecture CASE tool to automatically check the specification provides a powerful paradigm. It can reduce development time by reducing the manual effort required to detect and correct omissions, inconsistencies and errors in the specification.
Having detailed documentation of the framework of the architecture specification methodology enables easy identification of checks that can be carried out. Here a meta-model analysis of the ST System Software Architecture specification framework identified checks that could be carried out to highlight construction, completeness, connectivity and consistency issues.
Realization of the checks identified has been demonstrated on the Enterprise Architect CASE tool, which allows the checks to be programmed as scripts. The paper has documented examples of detecting construction and connectivity issues.
Our (near) future work will build on that presented in this paper as follows: (i) to extend our checking capability through development of appropriate scripts, (ii) to apply the checking capability on a ‘real’ project, and (iii) re-apply the checking capability to the System Hardware Architecture Specification.
9. Acknowledgements
The authors would like to acknowledge the technical contributions made to this work by Nico Vermaat, Elisa Cuoco-Vermaat and Peter Stieglitz. In addition, acknowledgments are made for supporting this work to Andrew Cofler, Philippe Ballard, Yann Garnier, Mark Bennett, Olga Skukowska and Bill Fletcher.
10. References
[1] Deaves, R.H. et al., ‘Embedded Software Architecture Specification Developments in Support of SoC Design and Re-use’, IPSOC 08, Dec 2008.
[2] Deaves, R.H. et al., ‘Initial Investigations into UML Based Architectural Reference Patterns in Set-top Boxes’, IPSOC 09, Dec 2009.
[3] Deaves, R.H. et al., ‘A UML Representation of a Passive Standby Power Management Architecture for Set-Top Boxes’, IPSOC 10, Dec 2010.
[4] Fradet, P., et al., ‘Consistency checking for multiple View Software Architecture’, ESEC/FSE’99, 1999.
[5] Simmonds, J., et al., ‘Description Logics for Consistency Checking of Architectural Features in UML 2.0 Models’, Universidad de Chile, 2005.
[6] Hofmeister, C., et al., 'Applied Software Architecture', Addison-Wesley, ISBN 0-201-32571-3, 2000.
[7] Enterprise Architect, v9, Sparx Systems.