Disclaimer

This document is the copyrighted property of ASAM e.V. Any use is limited to the scope described in the license terms.

Foreword

The ASAM OSI® (Open Simulation Interface) is a specification for interfaces between models and components of a distributed simulation. OSI is strongly focused on the environmental perception of automated driving functions.

The complexity of automated driving functions is rapidly increasing, which means the requirements for testing and development methods are growing too. Testing in virtual environments makes it possible to control and reproduce environment conditions.

To enable the widespread use of driving simulators for developers of functions, generic and standardized interfaces are needed for the connection between the function-development framework and the simulation environment. OSI permits easy and straightforward compatibility between automated driving functions and the variety of driving-simulation frameworks that are available. OSI addresses the emerging standard ISO 23150 [1], which standardizes the communication interface of real sensors. OSI defines generic interfaces to ensure the modularity, integrability, and interchangeability of the individual components.

The vision is to make it possible to connect any automated driving function to any driving simulator and emerging new hardware sensor generations using a standardized ISO 23150 [1] interface. This will simplify integration, significantly strengthening the accessibility and usefulness of virtual testing in the process.

1. Introduction

1.1. What has changed

Documentation Updates
  • Updated the documentation’s structure and migrated it to AsciiDoc.

  • Updated the description of OSI’s top level messages.

  • Updated additional descriptions, such as those for trace file formats, naming conventions, and installation instructions for Linux/Windows.

Technical Updates
  • Extended SensorViewConfiguration message to allow the consumer to choose not to include static content in the SensorView ground truth information.

  • Updated StationaryObject classification enums to avoid "pylon" duplication.

  • Extended StationaryObject classification message to include speed bumps.

  • Extended StationaryObject to include structures emitting electromagnetic waves (for example, this can be used to define a streetlamp).

  • Extended the TrafficSign classification message by adding the attributes country, country_revision, code, and sub_code, to better support internationalization and to align with ASAM OpenDRIVE.

  • Updated the color coding message description to better align with ASAM OpenDRIVE.

  • Updated the color coding message description to include Grey, RGB, RGBIR, HSV, and LUV, thus aligning with ISO 23150.

  • Added an enum for dashed lane boundaries to disambiguate their start and end.

  • Extended HostVehicleData with more comprehensive on-board information, and promoted the message to a top-level message for exchange between simulation environments and (external) dynamic models.

  • Extended LidarDetection message under feature data to include echo pulse width information to better support several sensor outputs that are currently on the market.

  • Extended OSI to include a generic external reference message to enable traceability for data from other standards, including other ASAM OpenX standards.

  • Added support for using OSI with FlatBuffers encoding for OSI messages instead of Protocol Buffers, in preparation for a potential switch of the default encoding in a future major release.

  • Extended LaneBoundary message to include additional types, to better align with ASAM OpenDRIVE.

1.2. Deliverables

The following deliverables are provided for ASAM OSI®:

  • User guide, that is, this document (partly normative)

  • .proto files (normative)

  • Doxygen reference documentation (not normative)

  • OSMP packaging (OSMP packaging has its own version number, but is compatible with the version of ASAM OSI®.)

Related topics

1.3. ASAM OSI® repositories

ASAM OSI® is an open-source standardization project. OSI and its supporting tools are developed publicly on GitHub.

The source code and documentation for OSI and OSI-related tools are spread over several repositories:

open-simulation-interface

Main repository containing the interface description based on Google’s Protocol Buffers, including helper scripts and test scripts. Hosts the .proto files as well as the Doxygen reference documentation.

osi-documentation

Contains the sources for this user guide and hosts the published ASAM OSI® user guide.

osi-sensor-model-packaging

Packaging specification for OSI models used in FMI 2.0 [2] simulation environments, including examples.

proto2cpp

Doxygen filter for creating the reference documentation from OSI .proto files.

1.4. Normative and non-normative statements

This specification uses a standard information structure. The following rules apply regarding normativity of the different sections:

  • Statements expressed as requirements, permissions, or prohibitions according to the use of modal verbs, as defined in Section Modal verbs, are normative.

  • Rules in "Rules" sections are normative.

  • .proto files are normative.

  • Examples, use case descriptions, and instructions are non-normative.

  • Doxygen reference documentation is non-normative.

1.5. Conventions

1.5.1. Units

Every field has its own unit that is defined in the proto files. For more information, see the example in the contribution guidelines.

1.5.2. Modal verbs

To ensure compliance with the ASAM OSI® standard, users need to be able to distinguish between mandatory requirements, recommendations, permissions, as well as possibilities and capabilities.

The following rules for using modal verbs apply:

Table 1. Rules for using modal verbs
Requirements (verbal forms: shall, shall not)
Requirements shall be followed strictly in order to conform to the standard. Deviations are not allowed.

Recommendations (verbal forms: should, should not)
Recommendations indicate that one possibility out of the several available is particularly suitable, without mentioning or excluding the other possibilities.

Permissions (verbal forms: may, need not)
Permissions indicate a course of action permissible within the limits of ASAM OSI® deliverables.

Possibilities and capabilities (verbal forms: can, cannot)
Verbal forms used to state possibilities or capabilities, whether technical, material, physical, etc.

Obligations and necessities (verbal forms: must, must not)
Verbal forms used to describe legal, organizational, or technical obligations and necessities that are not regulated or enforced by the ASAM OSI® standard.

1.5.3. Typographic conventions

This documentation uses the following typographical conventions:

Table 2. Typographical conventions
Mark-up Definition

Code elements

This format is used for code elements, such as technical names of classes and attributes, as well as attribute values.

Technical concepts

This format is used for technical concepts. Unlike code elements, technical concepts do not have special highlighting.

Code snippets

This format is used for excerpts of code that serve as an example for implementation.

Terms

This format is used to introduce glossary terms, new terms and to emphasize terms.

Mathematical elements

This format is used for calculations and mathematical elements.

<element>

This format describes a tag for an XML element.

@attribute

The "@" identifies an attribute of an XML element.

1.6. Versioning and compatibility

The version number is defined in InterfaceVersion::version_number in osi_version.proto as the field’s default value.

Major

A change in the major version makes the code and recorded proto messages incompatible.

Major changes include:

  • An existing field changing its meaning while keeping its field number. Example: optional double field = 1; changes to repeated double field = 1;.

  • Changing the definition of units or the interpretation of a field.

  • Deleting a field and reusing the field number.

  • Changing the technology from Protocol Buffers to FlatBuffers.

Minor

A change in the minor version indicates that previously recorded files remain compatible. However, code using the interface may need to be adapted.

Minor changes include:

  • Renaming a field without changing the field number.

  • Changing the names of messages.

  • Adding a new field in a message without changing the numbering of other fields.

Patch

Both recorded files and code remain compatible.

Patches include:

  • Changes to the file or folder structure that do not affect integration of the code in other projects.

  • Changing or adding comments.

  • Clarifying text passages explaining the message content.

1.7. Relationships with other standards

1.7.1. Positioning of ASAM OSI® within ASAM activities

ASAM OSI® (Open Simulation Interface) is part of the ASAM simulation standards that focus on simulation data for the automotive environment. Next to ASAM OSI®, ASAM provides other standards for the simulation domain, like OpenDRIVE [3], OpenSCENARIO [4] and OpenCRG [5].

OpenDRIVE defines a storage format for the static description of road networks. In combination with OpenCRG it is possible to add very detailed road surface descriptions to the road network. OpenDRIVE and OpenCRG only contain static content. To add dynamic content, OpenSCENARIO is needed. Combining all three standards provides a scenario-driven description of traffic simulation that contains static and dynamic content.

1.7.2. References to other standards

2. Open Simulation Interface

2.1. Idea behind Open Simulation Interface

ASAM OSI® Open Simulation Interface is a specification for interfaces between models and components of a distributed simulation. OSI is strongly focused on the environmental perception of automated driving functions. However, OSI also specifies interfaces for modeling traffic participants.

2.2. The basic design of OSI

2.2.1. Overview of OSI architecture

OSI contains an object-based environment description that uses the message format of the Protocol Buffers library, which is developed and maintained by Google. OSI defines top-level messages that are used to exchange data between separate models. Top-level messages define the GroundTruth interface, the SensorData interface, and – since OSI version 3.0.0 – the interfaces SensorView, SensorViewConfiguration, and FeatureData.

The following figure shows the interfaces and models involved in modeling a sensor.

Figure 1. Open Simulation Interface overview

OSI also defines interfaces for traffic participant models. The TrafficCommand interface makes it possible to send commands to traffic participant models. The TrafficUpdate interface makes it possible to receive the updated state from traffic participant models. The following figure shows the interfaces of a generic traffic participant.

Figure 2. Interface of a traffic participant

Traffic participant models may use other OSI interfaces internally, for example, to model autonomous vehicles. The following figure shows a more advanced use case for traffic participants.

Figure 3. Traffic participant with sensor models, AD function, and dynamic model

The HostVehicleData interface describes the measured internal states of a traffic participant. OSI currently provides only limited support for data structures that describe measured internal states of traffic participants. Actuator intentions are currently not covered by OSI and must be handled using a different data description format.

OSI uses singular instead of plural for repeated field names.
All fields in an interface are set to optional. required is not used.

This has been done to allow backward-compatible changes to fields. It also matches the default behavior of Protocol Buffers version 3, which no longer supports the required keyword; setting all fields to optional thus ensures update compatibility. However, this does not mean that it is optional to fill the field. For the purpose of providing a complete interface, all existing fields should be set, unless not setting a field carries a specific meaning, as indicated in the accompanying comment.
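
The following Python sketch illustrates this in practice by populating and serializing a GroundTruth message. It is non-normative and assumes the osi3 Python package from the open-simulation-interface repository is installed; the .proto files remain the authoritative definition of the fields used here.

    # Minimal sketch: populating and serializing an OSI GroundTruth message.
    from osi3.osi_groundtruth_pb2 import GroundTruth

    ground_truth = GroundTruth()

    # Simulation time of this ground-truth snapshot.
    ground_truth.timestamp.seconds = 0
    ground_truth.timestamp.nanos = 0

    # Repeated fields use singular names, for example moving_object.
    obj = ground_truth.moving_object.add()
    obj.id.value = 42
    obj.base.position.x = 10.0      # global coordinates, in meters
    obj.base.position.y = -2.5
    obj.base.position.z = 0.0
    obj.base.orientation.yaw = 0.1  # radians

    # All fields are optional in the Protocol Buffers sense, but should be
    # filled where meaningful; unset fields simply keep their default values.
    serialized = ground_truth.SerializeToString()
    restored = GroundTruth()
    restored.ParseFromString(serialized)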

All field numbers equal to or greater than 10000 are available for user-specific extensions via custom fields. No future evolution of OSI will therefore use field numbers equal to or greater than 10000.

2.2.2. Top-level interfaces

Ground truth

GroundTruth messages describe the simulated environment containing all simulated objects in the global coordinate system at consecutive time instances. They are based on data available to the simulation environment. GroundTruth messages are typically contained in SensorView messages.

Feature data

FeatureData messages contain detected features in the reference frame of a sensor. FeatureData messages are generated from GroundTruth messages. They serve, for example, as an input to sensor models simulating object detection or feature fusion models.

Sensor view

The sensor view provides the input to OSI sensor models. SensorView messages are derived from GroundTruth messages. All information regarding the environment is given with respect to the virtual sensor coordinate system, with two exceptions:

  • Physical technology-specific data, which is given with respect to the physical sensor coordinate system specified in the corresponding physical sensor’s mounting position. One example of technology-specific data is image_data of osi3::CameraSensorView.

  • Ground truth given in the global coordinate system.
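
The following non-normative Python sketch shows where these parts live in a SensorView message. It assumes the osi3 Python package is installed; the field names follow the current .proto files, which remain the normative reference.

    # Minimal sketch: the parts of a SensorView and their coordinate systems.
    from osi3.osi_sensorview_pb2 import SensorView

    sensor_view = SensorView()
    sensor_view.sensor_id.value = 1

    # Ground truth embedded in the sensor view is given in global coordinates.
    sensor_view.global_ground_truth.timestamp.seconds = 0

    # Technology-specific data, such as camera images, is given with respect
    # to the physical sensor coordinate system of the respective sensor.
    camera_view = sensor_view.camera_sensor_view.add()
    camera_view.image_data = b""  # placeholder for the rendered image buffer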

Sensor-view configuration

The sensor view is flexibly defined to provide different kinds of sensor models with an appropriate input. The sensor-view configuration defines the configuration of a particular sensor view.

The SensorViewConfiguration message is used in the initialization phase of a simulation to negotiate the sensor-view configuration for a particular SensorView input. It is also included as a sub-message in SensorView messages to indicate that the sensor-view configuration is valid for a particular SensorView message.

SensorViewConfiguration data has two main applications:

  • Enable the environment simulation to provide the necessary input to a sensor model.

  • Enable a sensor model to check whether the input matches its requirements. If the input does not match the requirements, the sensor model may terminate the simulation.

SensorViewConfiguration data is intended for the automatic configuration of the SensorView interface between an environment simulation and sensor model. The data is not intended to be a mechanism for parametrizing a generic sensor model.

During the initialization phase, there are two sources for SensorViewConfiguration data:

  1. SensorViewConfiguration data may be provided by the sensor model to the environment simulation. In this case, the data describes the input configuration that is requested by the sensor model. If the sensor model does not provide such data, then the environment simulation will fall back to manual configuration of the sensor view.

  2. SensorViewConfiguration data may be provided by the environment simulation. In response to the request by the sensor model, or based on manual configuration, the environment simulation configures the input and provides a new message that describes the actual configuration.

The configuration requested by the sensor model may differ from the configuration provided by the environment simulation. This happens when the environment simulation does not support a requested configuration or when the requested configuration is ambiguous.

In response to this difference, the sensor model can either accept this difference and adapt to it, or it can terminate the simulation to indicate that it is not able to accept the difference.

The packaging layer defines the specifics of this auto-negotiation mechanism.

After the initialization phase, the environment simulation provides the actual sensor-view configuration as part of each SensorView message.
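
As a purely illustrative, non-normative sketch of this negotiation from the sensor model’s point of view, the following Python fragment compares a requested configuration against the configuration provided by the environment simulation. The field names follow the current .proto files; the acceptance criteria are hypothetical and would depend on the specific model.

    # Illustrative check of a provided sensor-view configuration.
    from osi3.osi_sensorviewconfiguration_pb2 import SensorViewConfiguration

    def configuration_acceptable(requested, provided):
        # Example criterion: the provided field of view and range must cover
        # what the sensor model requested.
        return (provided.field_of_view_horizontal >= requested.field_of_view_horizontal
                and provided.range >= requested.range)

    requested = SensorViewConfiguration()
    requested.field_of_view_horizontal = 0.7   # radians
    requested.range = 150.0                    # meters

    provided = SensorViewConfiguration()
    provided.field_of_view_horizontal = 0.7
    provided.range = 200.0

    # If the provided configuration is not acceptable, the sensor model can
    # either adapt to it or terminate the simulation, as described above.
    assert configuration_acceptable(requested, provided)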

Sensor data

SensorData messages imitate the output of real sensors. They can be generated from GroundTruth messages, SensorView messages, FeatureData messages, or SensorData messages. With the exception of feature data, all information regarding the environment is given with respect to the virtual sensor coordinate system. Feature data is given with respect to the physical sensor coordinate system. Sensor data can be used as input for an automated driving function, a sensor model simulating limited perception, or a sensor fusion model.

Traffic command

TrafficCommand messages contain control commands from the scenario engine to traffic participant models.

Traffic update

TrafficUpdate messages are provided by traffic participants. They provide updates on the position, state, and future trajectory of a traffic participant back to the simulation environment.

2.2.3. Model types

Environmental effect model

Environmental effect models consume SensorView messages and produce SensorView messages. Environmental effect models may, for example, alter SensorView messages to include effects and phenomena caused by:

  • Shadows and occlusions

  • Weather effects

  • Physics of a sensor

  • Pre-processing of raw sensor data

Sensor model

Sensor models consume SensorView messages and produce SensorData messages. Sensor-model output does not represent raw data but detected features or classified objects.

Logical model

Logical models consume SensorData messages and produce SensorData messages.

An example of a logical model is a sensor-fusion model, which combines the output of multiple sensor models to produce data with less uncertainty. Another use case is the fault-injection model which, contrary to a sensor-fusion model, may be used to increase uncertainties.

Traffic participant

A traffic participant is an element of the simulated world and can change its state during simulation time, for example, its position and orientation. A traffic participant represents one of the following:

  • Living being

  • Means of transportation for living beings

  • Means of transportation for goods

  • Any other movable object that may travel on the road network

Pedestrians and animals are examples of traffic participants that are living beings. Vehicles are examples of traffic participants that are a means of transportation. The ego vehicle is therefore also a traffic participant.

The following figure shows the interface of a traffic participant.

Figure 4. Interface of a traffic participant

Traffic participant models may use other OSI interfaces internally, for example, to model autonomous vehicles. The following figure shows a more advanced use case for traffic participants.

Figure 5. Traffic participant using other OSI interfaces internally

With every simulation step, an OSI traffic participant model receives ground-truth data from the environment around itself, the sensor view. A traffic participant can output its own perceivable state, the traffic update. Traffic commands influence the behavior of the traffic participant model. They allow event-based communication towards the traffic participant, that is, at certain simulation steps. Traffic commands do not necessarily need to come from the environment simulation. They may come from a separate source, such as a scenario engine.

Modeling a traffic participant

Different models may be involved in modeling a traffic participant. In all the use cases, a simulator loads and interprets a scenario and a map prior to execution. The scenario is, for example, provided by OpenSCENARIO. The map data is, for example, provided by OpenDRIVE. During runtime the simulator interacts with the traffic participants via OSI messages. There may be multiple instances of a traffic participant. The traffic participants are co-simulated.

The following figure shows a very simple use case.

Figure 6. Simple traffic participant

The traffic participant bases its behavior only on an idealized view of the area around it. The traffic participant’s dynamics are included in the model if they exist.

The following figure shows a traffic participant with separately modeled behavior and dynamics.

Figure 7. Traffic participants with separate dynamics

OSI currently provides only limited support for data structures that describe measured internal states of the traffic participant. OSI does not currently cover actuator intentions. These must be handled with a different data description format.

The following figure shows a more complex traffic participant.

Figure 8. Traffic participant with sensor models, AD function, and dynamics model

This use case will probably be relevant for modeling the ego vehicle, which includes the system under test. The traffic participant includes an arbitrary number of sensor models. The sensor models consume sensor view and produce sensor data. The AD function consumes sensor data and produces input for the dynamics model. OSI currently does not support data flow to dynamics models. The loop to the environment simulation is closed via traffic update.

The following figure shows a cooperative use case with both an AD function and a human driver.

Figure 9. Traffic participant with an AD function and human driver

It is possible to model a traffic participant with an AD function in the loop, but a human driver can still override the actuation command. This type of cooperative use case is, for example, relevant to studies on human-machine interaction. In this example, a virtual on-screen representation of the scenario, or mock-up, is added after the AD function. The driver-in-the-loop interacts with the dynamics model via this mock-up. OSI’s limitations regarding dynamics-model input apply in this example as well.

2.2.4. Coordinate systems and reference points

Coordinate systems, reference points and coordinate transformation

OSI uses DIN ISO 8855:2013-11 [6] for coordinate systems and transformations between coordinate systems. OSI uses three coordinate systems:

Global coordinate system

Coordinate system for all entities that are part of ground truth. The global coordinate system is an inertial x/y/z-coordinate system. The origin is the global reference point that is determined by the environment simulation. This reference point may be derived from map data or other considerations. Global coordinates can be mapped to a geographic coordinate system via osi3::GroundTruth::proj_string.

Sensor coordinate system

Coordinate system for all entities that are part of sensor data. The origin is the mounting position of the physical sensor or a virtual mounting position, depending on the OSI message.

Object coordinate system

Local object coordinate system. The origin of this coordinate system is not necessarily identical to the center of the object’s bounding box. Where it differs, the documentation of the respective object provides the actual definition.

Coordinate transformations
Vehicle and sensor coordinate systems

When running simulations, it is frequently necessary to transform coordinates from the global coordinate system into the coordinate systems of a specific vehicle and its sensors.

This section provides an overview of the messages and fields involved and their relationship for this task. It demonstrates how a global coordinate system, vehicle coordinate system, and sensor coordinate system are related on the basis of a specific (ego) vehicle.

Mathematical Definitions of Coordinate Transformations

All vectors and matrices are noted with the reference frame as a superscript index and the direction of translation as a subscript index. [7] The translation direction is from the first index to the second index (src: source coordinate system, trg: target coordinate system). The vector \(\boldsymbol{v}^x\) denotes the 3D position of an object in the coordinate frame \(x\). Vector \(\boldsymbol{t}\) is the translation vector between two coordinate systems with the described indices for reference frame and direction. The angles yaw \(\psi\) around the z-axis, pitch \(\theta\) around the y-axis, and roll \(\phi\) around the x-axis are defined in a right-handed coordinate system according to DIN ISO 8855:2013 [6]. The sign of the angles corresponds to the direction of the transformation.

Transformation from source \(src\) to target \(trg\) coordinates:

\[\boldsymbol{v}^{trg} = \boldsymbol{R}_{src}^{trg} (\boldsymbol{v}^{src} - \boldsymbol{t}_{src,trg}^{src})\]

Transformation back from target \(trg\) to source \(src\) coordinates:

\[\boldsymbol{v}^{src} = (\boldsymbol{R}_{src}^{trg})^{-1} \boldsymbol{v}^{trg} + \boldsymbol{t}_{src,trg}^{src}\]

With the rotation matrix (from rotating the coordinate system) [8]:

\[\boldsymbol{R}_{src}^{trg}=\boldsymbol{R}_{yaw,pitch,roll} = \boldsymbol{R}_{z,y,x} = \boldsymbol{R}_{x}(\phi) \boldsymbol{R}_{y}(\theta) \boldsymbol{R}_{z}(\psi) \\ \boldsymbol{R}_{z,y,x} = \begin{pmatrix} 1 & 0 & 0\\ 0 & \cos(\phi) & \sin(\phi)\\ 0 & -\sin(\phi) & \cos(\phi) \end{pmatrix} \begin{pmatrix} \cos(\theta) & 0 & -\sin(\theta)\\ 0 & 1 & 0\\ \sin(\theta) & 0 & \cos(\theta) \end{pmatrix} \begin{pmatrix} \cos(\psi) & \sin(\psi) & 0\\ -\sin(\psi) & \cos(\psi) & 0\\ 0 & 0 & 1 \end{pmatrix} \\ \boldsymbol{R}_{z,y,x} = \begin{pmatrix} \cos(\theta)\cos(\psi) & \cos(\theta)\sin(\psi) & -\sin(\theta)\\ \sin(\phi)\sin(\theta)\cos(\psi)-\cos(\phi)\sin(\psi) & \sin(\phi)\sin(\theta)\sin(\psi)+\cos(\phi)\cos(\psi) & \sin(\phi)\cos(\theta)\\ \cos(\phi)\sin(\theta)\cos(\psi)+\sin(\phi)\sin(\psi) & \cos(\phi)\sin(\theta)\sin(\psi)-\sin(\phi)\cos(\psi) & \cos(\phi)\cos(\theta) \end{pmatrix}\]

Get Tait–Bryan angles from rotation matrix [9]:

\[\theta = -\arcsin(R_{13}) \\ \psi = \arctan2(R_{12}/\cos(\theta),R_{11}/\cos(\theta)) \\ \phi = \arctan2(R_{23}/\cos(\theta),R_{33}/\cos(\theta))\]

Relative orientation:

Object rotation matrix: \(\boldsymbol{R}_{object}^{src}\)
Host vehicle rotation matrix: \(\boldsymbol{R}_{ego}^{src}\)
Resulting rotation matrix between object and host: \(\boldsymbol{R}_{object}^{src}(\boldsymbol{R}_{ego}^{src})^{T}\)

To transform from world coordinates into vehicle coordinates and back, use the formulas above with the world coordinate frame \(w\) as the source system \(src\) and the vehicle coordinate frame \(v\) as the target system \(trg\). To transform from vehicle coordinates into sensor coordinates and back, use the formulas above with the vehicle coordinate frame \(v\) as the source system \(src\) and the sensor coordinate frame \(s\) as the target system \(trg\).
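
The following non-normative Python sketch implements these formulas directly, without external libraries. The vehicle position and orientation values are hypothetical; in practice they would be taken from the fields listed in the next section.

    # Sketch of the coordinate transformation defined above.
    import math

    def rotation_matrix(yaw, pitch, roll):
        # R_src^trg for rotating the coordinate system by yaw (z-axis),
        # pitch (y-axis), and roll (x-axis), as in the expanded matrix above.
        cy, sy = math.cos(yaw), math.sin(yaw)
        cp, sp = math.cos(pitch), math.sin(pitch)
        cr, sr = math.cos(roll), math.sin(roll)
        return [
            [cp * cy,                cp * sy,               -sp     ],
            [sr * sp * cy - cr * sy, sr * sp * sy + cr * cy, sr * cp],
            [cr * sp * cy + sr * sy, cr * sp * sy - sr * cy, cr * cp],
        ]

    def transform_to_target(v_src, t_src_trg, rotation):
        # v_trg = R_src^trg * (v_src - t_src,trg^src)
        d = [v_src[i] - t_src_trg[i] for i in range(3)]
        return [sum(rotation[i][j] * d[j] for j in range(3)) for i in range(3)]

    # Hypothetical example: express a global object position in vehicle coordinates.
    vehicle_position = [10.0, -2.5, 0.0]       # translation vector (global)
    vehicle_yaw_pitch_roll = (0.1, 0.0, 0.0)   # radians
    object_position_global = [20.0, 1.0, 0.5]

    R = rotation_matrix(*vehicle_yaw_pitch_roll)
    object_position_vehicle = transform_to_target(object_position_global,
                                                  vehicle_position, R)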

Corresponding messages

GroundTruth::moving_object::base::position

This field defines the position of the vehicle’s reference point in global coordinates. In Open Simulation Interface, an object’s position is defined by the coordinates of the center of the object’s 3D bounding box.

GroundTruth::moving_object::base::orientation

This field defines the orientation of the vehicle’s reference point in global coordinates.

GroundTruth::moving_object::vehicle_attributes::bbcenter_to_rear

This field specifies the vector pointing from the vehicle’s reference point to the middle of the rear axle under neutral load conditions, given in vehicle coordinates.

SensorData::mounting_position

This field defines the sensor’s position and orientation and thereby the origin of the sensor coordinate system. The mounting position is given in the vehicle coordinate system.

Example

The following image shows the relationship between the coordinate systems. The reference point of the vehicle is given by a vector in the global coordinate system. A vector pointing from the reference position of the vehicle to the center of the rear axle then yields the origin of the vehicle coordinate system. The mounting positions of the sensors and therefore the origins of the corresponding sensor coordinate systems are given with respect to the vehicle coordinate system.

Figure. OSI example coordinate systems

2.2.5. Layering

Data layer

The OSI data layer is defined in the message specifications using the ProtoBuf IDL [10]. This defines the data that can be transmitted using OSI, including the structure and the semantics of the data.

Additionally, it specifies the encoding to be used when OSI data is transmitted. Currently, ProtoBuf encoding is used, but other encodings are possible with the ProtoBuf IDL. FlatBuffer encoding has been implemented as an experimental feature.

The data layer does not directly define components and transmission routes. These are defined in the OSI packaging layer. There may be different packaging layer implementations using the shared data layer definitions. The data that is exchanged remains compatible regardless of the packaging layer implementation. The use of a shared data layer ensures easy bridging between different packaging layer implementations.

Packaging layer

The OSI packaging layer specifies how components that use the OSI data layer, for example, sensor models, are packaged for exchange.

This specifies model types and their mandatory and optional OSI inputs, OSI outputs, and parameter interfaces. A model type may be, for example, a sensor model or a traffic participant model. The packaging layer also specifies component technology standards. This makes it possible to encapsulate model types in easily exchangeable component packages that can be used across platforms and implementations.

Multiple packaging layer implementations are possible within the OSI framework. The shared data layer ensures easy bridging between the different implementations. The currently defined central packaging layer is the OSI Sensor Model Packaging (OSMP) specification. It is based on FMI 2.0 [2] and uses certain additional conventions to allow packaging of OSI using models as FMUs.

2.2.6. OSI trace files

OSI trace file formats

There are several formats for storing multiple serialized OSI messages in one trace file.

*.osi

Binary trace file. Messages are separated by a length specification before each message. The length is represented by a four-byte, little-endian, unsigned integer. The length does not include the integer itself. A sketch for reading this format follows the list below.

*.txt

Plain-text trace file. Messages are separated by __.

*.txth

Human-readable plain-text trace file. Messages are separated by newlines. These files may be used for manual checks.
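
As a non-normative illustration of the *.osi format, the following Python sketch reads such a trace file frame by frame. It assumes the osi3 Python package is installed and that the trace contains SensorView messages; the message type has to be known from the file name or other context.

    # Minimal sketch: reading a length-prefixed *.osi binary trace file.
    import struct
    from osi3.osi_sensorview_pb2 import SensorView

    def read_osi_trace(path):
        with open(path, "rb") as trace:
            while True:
                header = trace.read(4)
                if len(header) < 4:
                    break  # end of file
                # Four-byte, little-endian, unsigned frame length.
                (length,) = struct.unpack("<I", header)
                message = SensorView()
                message.ParseFromString(trace.read(length))
                yield message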

OSI trace file naming conventions

Name format

The names of OSI trace files should have the following format:

<timestamp>_<type>_<osi-version>_<protobuf-version>_<number-of-frames>_<custom-trace-name>.osi

Types

sd

Trace file contains SensorData messages.

sv

Trace file contains SensorView messages.

gt

Trace file contains GroundTruth messages.

tu

Trace file contains TrafficUpdate messages.

tc

Trace file contains TrafficCommand messages.

Example

Given an OSI trace file with the following information:

Timestamp (ISO 8601) [11]

20210818T150542Z

Type

SensorView

OSI version

3.1.2

Protobuf version

3.0.0

Number of frames

1523

Custom trace name

highway

The recommended file name is:

20210818T150542Z_sv_312_300_1523_highway.osi

Trace-file formatting scripts

The OSI repository contains Python scripts for converting trace files from one format to another. The formatting scripts are stored in the open-simulation-interface/format/ directory.

txt2osi.py

txt2osi.py converts plain-text trace files to binary .osi trace files. This script takes the following parameters:

--data, -d

String containing the path to the file with serialized data.

--type, -t

Optional string describing the message type used to serialize data. 'SensorView', 'GroundTruth', or 'SensorData' are permitted values. The default value is 'SensorView'.

--output, -o

Optional string containing the name of the output file. The default value is 'converted.osi'.

--compress, -c

Optional Boolean controlling whether to compress the output to an lzma file. True, or False are permitted values. The default value is False.
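
A typical invocation could look as follows (illustrative only; the file names are hypothetical):

    python3 txt2osi.py -d highway_trace.txt -t SensorView -o highway_trace.osi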

osi2read.py

osi2read.py converts trace files to human-readable .txth trace files. This script takes the following parameters:

--data, -d

String containing the path to the file with serialized data.

--type, -t

Optional string describing the message type used to serialize data. 'SensorView', 'GroundTruth', or 'SensorData' are permitted values. The default value is 'SensorView'.

--output, -o

Optional string containing the name of the output file. The default value is 'converted.txth'.

--format, -f

Optional string containing the format type of the trace file. 'separated', or None are permitted values. The default value is None.
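
A typical invocation could look as follows (illustrative only; the file names are hypothetical):

    python3 osi2read.py -d highway_trace.osi -t SensorView -o highway_trace.txth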

Related topics

2.3. Setting up OSI

2.3.1. Installing OSI for C++ on Linux

Prerequisites

  • You have installed cmake.

  • You have installed protobuf.

  • You must have super user privileges.

Steps

  1. Open a terminal.

  2. Clone the Open Simulation repository.

    git clone https://github.com/OpenSimulationInterface/open-simulation-interface.git
  3. Switch to the repository directory.

    cd open-simulation-interface
  4. Create a new directory for the build.

    mkdir build
  5. Switch to the new directory.

    cd build
  6. Run cmake. To build a 32-bit target under 64-bit Linux, add -DCMAKE_CXX_FLAGS="-m32" to the cmake command. In this case, protobuf must be in 32-bit mode too.

    cmake ..
  7. Run make.

    make
  8. Install Open Simulation Interface.

    sudo make install

2.3.2. Installing OSI for Python on Linux

Prerequisites

  • You have installed pip3.

  • You have installed python-setuptools.

  • You have installed protobuf.

  • For a local installation, you have installed virtualenv.

Steps

  1. Open a terminal.

  2. Clone the Open Simulation repository.

    git clone https://github.com/OpenSimulationInterface/open-simulation-interface.git
  3. Switch to the repository directory.

    cd open-simulation-interface
  4. Create a new virtual environment.

    virtualenv -p python3 venv
  5. Activate the virtual environment.

    source venv/bin/activate
  6. Install Open Simulation Interface.

    1. Local installation

      pip3 install .
    2. Global installation

      sudo pip3 install .
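
To verify the installation, a quick, non-normative check is to import one of the generated message modules:

    python3 -c "from osi3.osi_groundtruth_pb2 import GroundTruth; print(GroundTruth.DESCRIPTOR.full_name)"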

2.3.3. Installing OSI for C++ on Windows

Prerequisites

  • You have installed cmake as an administrator.

  • You have installed protobuf as an administrator.

Steps

  1. Open a terminal as administrator.

  2. Clone the Open Simulation repository.

    git clone https://github.com/OpenSimulationInterface/open-simulation-interface.git
  3. Switch to the repository directory.

    cd open-simulation-interface
  4. Create a new directory for the build.

    mkdir build
  5. Switch to the new directory.

    cd build
  6. Run cmake. To build a 64-bit target, add Win64 to the generator name. In this case, protobuf and protoc.exe must be in 64-bit mode too.

    cmake .. [-G <generator>] [-DCMAKE_INSTALL_PREFIX=<osi-install-directory>]
  7. Build and install OSI.

    cmake --build . [--config Release]
    cmake --build . --target install

2.3.4. Installing OSI for Python on Windows

Prerequisites

  • You have installed Python with administrator rights.

  • You have added Python to PATH.

Steps

  1. Open a terminal.

  2. Clone the Open Simulation repository.

    git clone https://github.com/OpenSimulationInterface/open-simulation-interface.git
  3. Switch to the repository directory.

    cd open-simulation-interface
  4. Run the setup script.

    python setup.py install

3. OSI Sensor Model Packaging

3.1. Introduction

OSI Sensor Model Packaging (OSMP) is a package layer specification for the Open Simulation Interface (OSI). It specifies how models that use OSI are packaged as Functional Mock-up Units (FMUs) in accordance with the Functional Mock-up Interface 2.0 (FMI 2.0).

This is version 1.3.0 of this specification. The version number is to be interpreted according to the Semantic Versioning Specification 2.0.0.

3.2. OSMP specification

3.2.1. Model types

The current specification supports packaging the following model types as Functional Mock-up Units (FMUs):

Environmental effect model

This model type can be used to model environmental effects or the physical parts of sensors. It consumes osi3::SensorView as input and produces osi3::SensorView as output.

Sensor models

This model type is used to model the perception function of sensors. It consumes osi3::SensorView as input and produces osi3::SensorData as output.

Logical models

This model type is used to model the further processing of sensor output, for example, sensor fusion. It consumes osi3::SensorData as input and produces osi3::SensorData as output.

Traffic participant models

This model type is used to model whole traffic participants, such as vehicles or pedestrians. Traffic participant models may internally use environmental effect models, sensor models, or logical models as part of a modeled autonomous vehicle. They may also be used to implement surrounding traffic in simplified ways. Traffic participant models consume osi3::SensorView as input and produce osi3::TrafficUpdate as output. They may also consume osi3::TrafficCommand as input to allow control by a scenario engine or other coordinating function. They may also produce osi3::TrafficCommandUpdate as output to allow status responses to such control messages.

All models may also consume a global osi3::GroundTruth parameter during initialization.

Complex models may combine various aspects of the above model types. Manual intervention is needed to configure and set up these FMUs.

3.2.2. Basic conventions

The model shall be packaged as a valid FMU for co-simulation, as specified in the FMI 2.0 standard [2]. Unless otherwise noted, all specifications in the FMI 2.0 [2] standard apply as-is.

The following annotation shall be placed into the <VendorAnnotations> element of the modelDescription.xml to mark the FMU as being conformant to this version of the specification:

<Tool name="net.pmsf.osmp" xmlns:osmp="http://xsd.pmsf.net/OSISensorModelPackaging"><osmp:osmp version="1.3.0" osi-version="x.y.z"/></Tool>

The @osi-version attribute should contain the major, minor, and patch version number of the OSI specification that this model was compiled against. Indicating the OSI version ensures that the importing environment can determine which OSI version to use prior to communicating with the FMU.

If OSMP is used without OSI data being transported across binary variables, @osi-version should not be specified.

The variable naming convention of the FMU shall be structured.

The default experiment step size should be defined. It should indicate the actual model refresh rate for the input side. A simulator can call the FMU fmi2DoStep routine at this implied rate. If the step size is not supplied, the model communication rate is determined from any input configuration data the model provides, or it must be configured manually.

The model may have inputs, outputs, and parameters that are not specified by OSMP if the model can be run correctly with all of those variables left unconnected and at their default values.

3.2.3. Binary variables

FMI 2.0 [2] does not directly support the efficient exchange of arbitrary binary data between FMUs. OSMP therefore introduces the concept of notional binary variables that are mapped to actual integer variables for use with FMI 2.0 [2]. Future FMI versions will directly support binary variables compatible with this concept.

A notional binary variable named <prefix> is defined using the following conventions:

The name of the notional binary variable given by <prefix> shall be a valid structured name according to FMI 2.0 [2].

The FMU shall not contain any other variable that is named <prefix>. This restriction ensures that there is no conflict between notional binary variables and actual variables.

For each notional binary variable, three actual FMU integer variables shall be defined:

<prefix>.base.lo

Lower, meaning the least significant, 32-bit address part of the binary data buffer to be passed into or out of the model, cast into a signed 32-bit integer without changing the bit values.

<prefix>.base.hi

Higher, meaning the most significant, 32-bit address part of the binary data buffer to be passed into or out of the model, cast into a signed 32-bit integer without changing the bit values. Note that this variable is only used for 64-bit platforms. For 32-bit platforms, it shall still be present but will always be 0 to support FMUs with 32-bit and 64-bit implementations.

<prefix>.size

Size of the binary data buffer to be passed into or out of the model as a signed 32-bit integer. This restricts the maximum size of binary data buffers being passed around to a size less than 2 GB.

The three actual variables shall have matching causality and variability, which will be the causality and variability of the notional binary variable.

The variables shall have a start value of 0, indicating that no valid binary data buffer is available. The variables may have a different or no start value if the combination of causality and variability precludes this, for example, for @variability = fixed or @variability = tunable and @causality = calculatedParameter.

Model FMUs shall interpret values of 0 for the merged base address or the size to indicate that no valid binary data buffer is available. Model FMUs shall handle this case safely.
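
To illustrate the address handling, the following non-normative Python sketch shows how the two signed 32-bit parts can be merged into a 64-bit base address and how a received buffer could be decoded. A real model would typically do this in C or C++ inside the FMU; the helper names here are hypothetical.

    # Non-normative sketch of merging <prefix>.base.hi/.lo/.size into a buffer.
    import ctypes
    from osi3.osi_sensorview_pb2 import SensorView

    def merge_address(base_hi, base_lo):
        # Reinterpret the signed 32-bit values as unsigned bit patterns and
        # combine them; on 32-bit platforms base_hi is always 0.
        return ((base_hi & 0xFFFFFFFF) << 32) | (base_lo & 0xFFFFFFFF)

    def decode_sensor_view(base_hi, base_lo, size):
        address = merge_address(base_hi, base_lo)
        if address == 0 or size == 0:
            return None  # no valid binary data buffer available
        buffer = ctypes.string_at(address, size)  # copy size bytes from memory
        sensor_view = SensorView()
        sensor_view.ParseFromString(buffer)
        return sensor_view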

The three actual variables shall contain an annotation of the following form in the <Annotations> child element of their <ScalarVariable> element of the modelDescription.xml:

<Tool name="net.pmsf.osmp" xmlns:osmp="http://xsd.pmsf.net/OSISensorModelPackaging"><osmp:osmp-binary-variable name="<prefix>" role="<role>" mime-type="<mime-type>"/></Tool>

<prefix> is the prefix as defined above, and @role is either base.lo, base.hi, or size, depending on the variable.

It is an error if there is not exactly one variable of each role for the same name.

The MIME type given in @mime-type shall be a valid MIME type specification.

It is an error if the MIME types specified in the annotations for one notional binary variable differ.

In the case of OSI-specified data, the MIME type shall have the following form to indicate that the binary content is conformant to the given OSI version and contains a message of the given type:

application/x-open-simulation-interface; type=<type>; version=x.y.z

<type> shall be the name of an OSI top-level message, excluding the osi3:: prefix.

The version parameter of the MIME type application/x-open-simulation-interface will default to the version specified in the @osi-version attribute as part of the top-level <osmp:osmp> annotation. It is an error if a version number is specified neither as part of the MIME type nor using the @osi-version attribute.

The guaranteed lifetime of the binary data buffer pointer transported through the actual variables is defined for each kind of variable, as specified in the following sections.

Generally the lifetime for inputs is from the time they are set to the time the corresponding co-simulation step calculation finishes. For outputs the lifetime is extended from the point the output is provided at the end of a co-simulation step until the end of the next co-simulation step.

This convention allows the use of FMUs in simulation engines that have no special support for the protocol buffer pointers: The simulation engine can rely on the provided buffer to remain valid from the moment it is passed out of a model until the end of the next co-simulation calculation cycle. Thus, the simulation engine does not need to copy the contents in that case, corresponding to zero-copy output for the simulation engine at the cost of double buffering for the model providing the output data. It is possible to daisy-chain FMUs with protocol buffer inputs and outputs in a normal simulation engine supporting FMI, and get valid results.

3.2.4. Sensor view inputs

Prefix

Sensor view inputs shall be named with the following prefix:

OSMPSensorViewIn

Rules

  • If only one sensor view input is configured, the prefix shall only be OSMPSensorViewIn.

  • If more than one sensor view input is configured, the prefix shall be extended by an array index, for example, OSMPSensorViewIn[1] and OSMPSensorViewIn[2].

  • Array indices shall start at 1 and shall be consecutive.

  • Each sensor view input shall be defined as a notional discrete binary input variable with @causality="input" and @variability="discrete".

  • The MIME type of the variable shall specify the type=SensorView as part of the MIME type parameters.

  • Sensor view data shall be encoded as osi3::SensorView.

  • The sensor view passed to the model shall contain data as specified by the parameter OSMPSensorViewInConfiguration.

  • The guaranteed lifetime of the sensor view protocol-buffer pointer provided as input to the FMU shall be from the time of the call to fmi2SetInteger that provides those values until the end of the following fmi2DoStep call.

3.2.5. Sensor view input configuration

For each notional sensor view input variable with the base prefix OSMPSensorViewIn, a corresponding calculated parameter with the base prefix OSMPSensorViewInConfigRequest and a parameter with the base prefix OSMPSensorViewInConfig can exist.

During FMI initialization mode, the simulation environment queries the value of OSMPSensorViewInConfigRequest. Taking this value into account, a suitable and supported sensor view input configuration is determined. Before exiting initialization mode, the simulation environment then sets this configuration using the corresponding OSMPSensorViewInConfig parameter.

Prefix

Sensor view input configurations shall be named with the following prefix:

OSMPSensorViewInConfig

Sensor view input configuration requests shall be named with the following prefix:

OSMPSensorViewInConfigRequest

Rules

  • If more than one sensor view input is to be configured, the prefix shall be extended by an array index, for example, OSMPSensorViewInConfigRequest[1], OSMPSensorViewInConfig[1], OSMPSensorViewInConfigRequest[2], and OSMPSensorViewInConfig[2].

  • Array indices shall start at 1, shall be consecutive, and shall correspond between sensor view inputs and sensor view configuration parameters.

  • If the calculated parameter OSMPSensorViewInConfigRequest exists, then the corresponding parameter OSMPSensorViewInConfig shall exist.

  • OSMPSensorViewInConfigRequest shall have a @causality = "calculatedParameter" and a @variability = "fixed" or @variability = "tunable".

  • OSMPSensorViewInConfig shall have a @causality = "parameter" and a @variability = "fixed" or @variability = "tunable".

  • The @variability values of OSMPSensorViewInConfigRequest and OSMPSensorViewInConfig shall match.

  • The MIME type of both variables shall specify type="SensorViewConfiguration" as part of the MIME type parameters.

  • The variable values shall be encoded as osi3::SensorViewConfiguration.

  • As long as no non-zero value has been assigned to OSMPSensorViewInConfig, the value of the corresponding OSMPSensorViewInConfigRequest shall be the desired sensor view configuration for the corresponding variable OSMPSensorViewIn. This configuration is based on model-internal requirements or any other parameters on which this calculated parameter depends.

  • Once a non-zero value has been assigned to OSMPSensorViewInConfig, the value of the corresponding OSMPSensorViewInConfigRequest shall be an encoded OSI protocol buffer containing the same data as the OSMPSensorViewInConfig.

  • During FMI initialization mode, the simulation environment should query the value of OSMPSensorViewInConfigRequest and determine a suitable sensor view input configuration.

  • Before exiting initialization mode, the simulation environment shall set the sensor view input configuration using the corresponding OSMPSensorViewInConfig parameter.

  • The guaranteed lifetime of the sensor view configuration protocol-buffer pointers shall be from the time of the call to fmi2SetInteger that provides those values until the end of the FMI initialization mode, indicated by the return of the fmi2ExitInitializationMode call.

3.2.6. Sensor view outputs

Prefix

Sensor view outputs shall be named with the following prefix:

OSMPSensorViewOut

Rules

  • If only one sensor view output is configured, the prefix shall only be OSMPSensorViewOut.

  • If more than one sensor view output is configured, the prefix shall be extended by an array index, for example, OSMPSensorViewOut[1] and OSMPSensorViewOut[2].

  • Array indices shall start at 1 and shall be consecutive.

  • Each sensor view output shall be defined as a notional discrete binary output variable with @causality="output" and @variability="discrete".

  • The MIME type of the variable shall specify the type="SensorView" as part of the MIME type parameters.

  • Sensor view data shall be encoded as osi3::SensorView.

  • The guaranteed lifetime of the sensor view protocol-buffer pointer provided as output by the FMU shall be from the end of the call to fmi2DoStep that calculated this buffer until the beginning of the second fmi2DoStep call after that.

3.2.7. Sensor data inputs

Prefix

Sensor data inputs shall be named with the following prefix:

OSMPSensorDataIn

Rules

  • If only one sensor data input is configured, the prefix shall only be OSMPSensorDataIn.

  • If more than one sensor data input is configured, the prefix shall be extended by an array index, for example, OSMPSensorDataIn[1] and OSMPSensorDataIn[2].

  • Array indices shall start at 1 and shall be consecutive.

  • Each sensor data input shall be defined as a notional discrete binary input variable with @causality="input" and @variability="discrete".

  • The MIME type of the variable shall specify the type="SensorData" as part of the MIME type parameters.

  • Sensor data shall be encoded as osi3::SensorData.

  • The guaranteed lifetime of the sensor data protocol-buffer pointer provided as input to the FMU shall be from the time of the call to fmi2SetInteger that provides those values until the end of the following fmi2DoStep call.

3.2.8. Sensor data outputs

Prefix

Sensor data outputs shall be named with the following prefix:

OSMPSensorDataOut

Rules

  • If only one sensor data output is configured, the prefix shall only be OSMPSensorDataOut.

  • If more than one sensor data output is configured, the prefix shall be extended by an array index, for example, OSMPSensorDataOut[1] and OSMPSensorDataOut[2].

  • Array indices shall start at 1 and shall be consecutive.

  • Each sensor data output shall be defined as a notional discrete binary output variable with @causality="output" and @variability="discrete".