The condition of your devices at a glance
The device health metrics allow you to provide evidence of reliable, continuous data collection and to diagnose potential problems yourself (e.g. an unstable network connection or power supply). The following metrics are supported:
Device uptime, status, reboots, available memory space
Device temperature (supported for BMA/P101/OP101/Jetson Nano)
LTE modem traffic, signal strength and reconnects (supported for BMA/OP100/OP101)
Camera status and processing speed of the camera (FPS)
Created and pending events
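To make the list above concrete, the sketch below models such a health snapshot as a simple data structure. All field names and example values are illustrative assumptions and do not reflect the actual product schema.

```python
# Hypothetical shape of a device health snapshot; field names are
# illustrative only and not the actual product schema.
from dataclasses import dataclass

@dataclass
class DeviceHealth:
    uptime_seconds: int          # device uptime
    status: str                  # e.g. "online" / "offline"
    reboots_last_24h: int        # reboots
    free_disk_mb: int            # available memory space
    temperature_celsius: float   # BMA/P101/OP101/Jetson Nano only
    lte_traffic_mb: float        # LTE modem traffic (BMA/OP100/OP101)
    lte_signal_dbm: int          # LTE signal strength
    lte_reconnects: int          # LTE reconnects
    camera_fps: float            # processing speed of the camera
    events_created: int          # created events
    events_pending: int          # pending events

sample = DeviceHealth(
    uptime_seconds=86_400, status="online", reboots_last_24h=0,
    free_disk_mb=12_288, temperature_celsius=54.0, lte_traffic_mb=320.5,
    lte_signal_dbm=-71, lte_reconnects=1, camera_fps=24.8,
    events_created=1_532, events_pending=0,
)
print(sample)
```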
This section describes which statuses devices and camera streams can have and what to expect in each case.
In the Control Center, you will find a basic monitoring status at both camera and device level. This status indicates whether your cameras are operational or whether action is required to get them running.
In the camera overview of your devices and dashboards, you will find the camera monitoring, which shows you whether your camera is working as expected. In the device configuration, you will find the device monitoring, which shows the worst status of all cameras running on the device. This allows you to see directly if something is wrong.
You can find out how to configure automatic e-mail notifications for status changes in this section.
The camera monitoring can show the following statuses:

Status | Description |
---|---|
OK | Everything is fine and your camera is running as expected: the software is running smoothly, the camera connection is available and the MQTT broker is connected. |
Not configured | The camera is not configured. You must set up the camera and data connection as well as the scenario configuration for your use case. |
Warning | Data is still being generated and delivered, but there are problems that could affect the accuracy of the data. Possible causes: a) video frames cannot be retrieved correctly (at least 10% of the frames are faulty); b) performance problems: the frame rate (FPS) is below the minimum required by the configured event types. |
Error | The software is not running and no data is generated. Possible causes: a) the Docker container and/or the software is not running correctly, please contact support; b) data cannot be sent to the MQTT endpoint: more than 10 MQTT events have failed to reach the broker for at least 10 seconds, check whether the broker is active; c) the camera is not connected: the camera connection cannot be established, check whether the camera is switched on and whether the camera credentials (user name, password) are configured correctly. |
Offline | The Swarm Perception Box or your hardware is offline. Check the power supply and network connection. |
Pending | If you have recently adjusted the configuration, the status is set to Pending for about five minutes until the correct status is determined. |
Deactivated | The stream is deactivated and can only be reactivated if sufficient licenses are available. This status can also be used to keep a configuration for a later point in time without the device currently being needed. |
If your device is unexpectedly displayed as offline, please contact our support.
The device monitoring reflects the overall worst status, so you can see directly in your device list if a camera is not working as expected. It can show the following statuses:

Status | Description |
---|---|
OK | Everything is fine: all cameras configured on your device are working as they should. |
Not configured | At least one camera on the device is not configured. Check the status in the camera monitoring for more details. |
Error | At least one camera is not sending data as expected. |
Warning | At least one camera on the device has the status Warning. |
Offline | The device is offline. Check that the hardware is connected to the power supply and that there is a network connection. |
Pending | If you have just changed the configuration of one of the cameras, this status is displayed for a maximum of five minutes before the status is determined again. |
Deactivated | At least one camera is deactivated. |
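Because the device status is simply the worst status across all camera streams on the device, the roll-up can be pictured as in the minimal sketch below. The severity ordering is an assumption for illustration, not the product's internal logic.

```python
# Illustrative "worst status wins" roll-up for device monitoring.
# The severity order is an assumption for demonstration purposes.
SEVERITY = ["OK", "Deactivated", "Pending", "Not configured", "Warning", "Error", "Offline"]

def device_status(camera_statuses: list[str]) -> str:
    """Return the worst (highest-severity) camera status on the device."""
    if not camera_statuses:
        return "Not configured"  # assumption: no cameras means nothing is configured yet
    return max(camera_statuses, key=SEVERITY.index)

print(device_status(["OK", "Warning", "OK"]))  # -> "Warning"
```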
The camera status takes into account the system, the camera input and the MQTT connection.
Here you will find information on how to successfully connect and configure the camera.
To open the configuration page of the Perception Box, simply click on the respective box in the list view. From there you can manage all cameras running on this device.
You are of course free to name your BERNARD devices as you wish; we recommend using a logical naming scheme.
Click on one of the displayed cameras to open the Settings of the respective camera. You can name the camera there. At the top, you have the option of deactivating the camera stream. If a camera stream is deactivated, it is no longer taken into account by the software. The configuration is retained.
As soon as you have configured the camera connection (Camera Connection), you will see a preview of the camera image. You can now continue with the scenario configuration.
Now that you see your camera images, it's time for the Configuration. This is where the magic happens!
Since our software is mainly intended and used for specific use cases, you will find all the information you need for the perfect setup in the respective sections for your use case:
In the configuration, you can select the model suitable for your use case and configure any combination of event triggers and additional functions.
The model determines which detection engine the BERNARD devices run.
Below you will find a brief description of each model. To decide which one you want or should use for your application, please read the section on the respective use case:
This model detects vehicles, people on two-wheelers and people in highly dynamic scenes (e.g. road traffic or highways, i.e. scenes in which objects move quickly).
This model detects vehicles, people on two-wheelers and people in scenes with low dynamics, for example parking lots where objects are not moving or are moving slowly. Since this model analyzes the video at a higher resolution, it can detect objects that are further away from the camera. This requires more computing power and is therefore currently only recommended for this use case.
This model detects vehicles, people on two-wheelers and people in low dynamic scenes in parking lots when a fisheye camera is used. This does not work for fast, dynamic traffic scenes.
This model detects the entire body of a person. This makes it ideal for detecting, tracking and counting people when they are further away (>5 m) from the camera.
This model detects a person's head and is therefore ideal for detecting, tracking and counting people when they are closer (<6 m) to the camera.
When detecting and tracking people, we never perform facial recognition. No sensitive personal information of any kind is processed. Contact us at any time if you have any questions about personal data or data protection.
Each event trigger generates a unique ID in the background. You can assign user-defined names so that you can keep track of all configured triggers. This name is then used to label the corresponding data in the data evaluation tool.
Below you will find explanations of the individual event types and triggers:
We provide various templates for the three different use cases to make configuration and general use as easy as possible for you.
Parking events: Templates for all use cases in the area of parking space monitoring
Traffic events: Templates for use in the area of traffic monitoring and traffic safety
People events: These templates are perfect for use with our People Full Body or People Head models
The available event triggers and the individually customizable event settings are described below:
**Counting Lines (CL) trigger a count as soon as the center of an object crosses the line.** When configuring a CL, it is important to take the camera perspective into account.
The CL also logs the direction in which the object crossed the line as In or Out. You can swap In and Out at any time to adjust the direction accordingly. In addition, a separate name can be assigned to each of the two directions.
By default, a CL counts each object only once. If every crossing is to be counted, you can activate events for repeated CL crossings. The only restriction is that repeated crossings are only counted if at least five seconds lie between them.
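To make the counting rule concrete, the sketch below detects when an object's center crosses the line, derives the In/Out direction from the side the center came from, and applies the five-second rule for repeated crossings. The geometry helper, class and direction naming are assumptions for illustration; as noted above, the default behavior counts each object only once.

```python
# Illustrative counting-line (CL) logic: a count triggers when the object's
# center crosses the line; the direction follows from which side the center
# was on before the crossing. This sketch models the optional "repeated
# crossings" mode, where further counts need at least five seconds in between.
import time

def side_of_line(p, a, b):
    """> 0 if p lies on one side of the line a->b, < 0 on the other, 0 on the line."""
    return (b[0] - a[0]) * (p[1] - a[1]) - (b[1] - a[1]) * (p[0] - a[0])

class CountingLine:
    def __init__(self, a, b, min_gap_s=5.0):
        self.a, self.b = a, b
        self.min_gap_s = min_gap_s
        self.last_side = {}    # object id -> side of the line at the last update
        self.last_count = {}   # object id -> timestamp of the last count

    def update(self, obj_id, center, now=None):
        now = time.monotonic() if now is None else now
        side = side_of_line(center, self.a, self.b)
        prev = self.last_side.get(obj_id)
        self.last_side[obj_id] = side
        if prev is None or prev == 0 or side == 0 or (prev > 0) == (side > 0):
            return None  # the center did not cross the line
        if now - self.last_count.get(obj_id, float("-inf")) < self.min_gap_s:
            return None  # repeated crossing within five seconds: not counted
        self.last_count[obj_id] = now
        return "In" if prev > 0 else "Out"  # which side counts as "In" is configurable

cl = CountingLine(a=(0, 0), b=(100, 0))
cl.update("car-1", (50, -5), now=0.0)        # first position: no crossing yet
print(cl.update("car-1", (50, 5), now=1.0))  # -> "Out" (center crossed the line)
```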
In addition to the repeated CL crossings, ANPR and Speed Estimation are also available as triggers or settings.
Speed Estimation can be activated as a special trigger setting for a CL in the left sidebar. This adds a second line that you can use to specify the distance in your scenario. For the best results, we recommend a straight section without curves or slopes.
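The underlying calculation is straightforward: with a known distance between the two lines, the speed is that distance divided by the time the object needs between the two crossings. A minimal sketch (the function name and units are assumptions):

```python
# Speed estimation from two lines a known distance apart:
# speed = distance / travel time, converted from m/s to km/h.
def estimate_speed_kmh(distance_m, t_first_line_s, t_second_line_s):
    travel_time = t_second_line_s - t_first_line_s
    if travel_time <= 0:
        raise ValueError("the second crossing must happen after the first")
    return distance_m / travel_time * 3.6

# Example: 20 m between the lines, crossed 1.5 s apart -> 48.0 km/h
print(estimate_speed_kmh(20.0, t_first_line_s=10.0, t_second_line_s=11.5))
```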
**Regions of Interest (RoI) count objects in a specific area.** In addition, the class (Class) and dwell time (Dwell Time) are determined.
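The dwell time can be pictured as the span between an object entering and leaving the region. A minimal bookkeeping sketch; the region test (the `inside` flag) and the identifiers are simplified assumptions:

```python
# Illustrative dwell-time bookkeeping for a region of interest (RoI):
# dwell time is the time between an object entering and leaving the region.
class RoiDwell:
    def __init__(self):
        self.entered_at = {}  # object id -> entry timestamp

    def update(self, obj_id, inside, now):
        if inside and obj_id not in self.entered_at:
            self.entered_at[obj_id] = now             # object entered the region
            return None
        if not inside and obj_id in self.entered_at:
            return now - self.entered_at.pop(obj_id)  # dwell time in seconds
        return None

roi = RoiDwell()
roi.update("car-7", inside=True, now=100.0)
print(roi.update("car-7", inside=False, now=142.5))   # -> 42.5
```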
Depending on the scenario, we distinguish between three types of RoI and provide a template for each of them:

| | Single Space Parking | Multi Space Parking | Generic |
|---|---|---|---|
| Event Trigger | Duration (Time) | Duration (Time) | Duration (Time) or Status (Occupancy) |
| Event Type | Parking | Parking | People and Traffic |
| Number of preset Objects | 1 | 5 | 1 |
| Color | dark green | purple | light green |
Origin-Destination (OD) zones are used for origin-destination analyses. A count is generated when an object moves through OD1 and then OD2. At least two zones must be configured for an OD analysis.
The first zone that the object passes through is called the origin zone (Origin); the last zone it passes through is referred to as the destination zone (Destination).
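In code terms, deriving the origin and destination from an object's track is just a matter of looking at the first and last OD zone it passed through; the zone names below are illustrative.

```python
# Illustrative origin-destination (OD) pairing: the first OD zone an object
# passes through is the Origin, the last one the Destination.
def origin_destination(zones_visited):
    """Return (origin, destination), or None if fewer than two zones were crossed."""
    if len(zones_visited) < 2:
        return None  # an OD count needs at least two configured zones
    return zones_visited[0], zones_visited[-1]

print(origin_destination(["OD1", "OD2"]))         # -> ("OD1", "OD2")
print(origin_destination(["OD1", "OD3", "OD2"]))  # -> ("OD1", "OD2")
```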
Your devices send the results of the real-time analysis to an MQTT broker. The default configuration sends data to the Azure Cloud and to Data Analytics so that it can be retrieved there. If you would like to set up a custom MQTT broker, please contact our support team.
Message compression can save up to 50% of the bandwidth used to send events to the MQTT broker. Please note that the receiving broker and application must support zlib/inflate decompression.
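If you enable compression towards a custom broker, your own consumer has to inflate the payload before parsing it. Below is a minimal subscriber sketch using the paho-mqtt client and Python's zlib module; the broker address, topic and the assumption that events arrive as JSON are placeholders for illustration.

```python
# Minimal MQTT subscriber that inflates zlib-compressed event payloads.
# Broker host and topic are placeholders; adjust them to your own setup.
import json
import zlib

import paho.mqtt.client as mqtt

def on_message(client, userdata, msg):
    try:
        payload = zlib.decompress(msg.payload)  # "inflate" the compressed event
    except zlib.error:
        payload = msg.payload                   # fall back for uncompressed messages
    print(json.loads(payload))

client = mqtt.Client(mqtt.CallbackAPIVersion.VERSION2)  # paho-mqtt >= 2.0
client.on_message = on_message
client.connect("broker.example.com", 1883)
client.subscribe("events/#")
client.loop_forever()
```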
The Control Center is where you configure your purchased BERNARD hardware products.
Under the Devices menu item in the Control Center, you can centrally manage all your BERNARD devices and configure the cameras so that they generate data according to your requirements for your use cases.
Below you will find a description of the different parts of the device configuration:

Number | Description |
---|---|
1 | Switch between Devices, Data Analytics and administration. |
2 | Device Name/ID of your hardware. You can customize the device name as you wish; the ID is used for communication between the edge device and the Azure Cloud. |
3 | Metadata: organize your devices and enrich the generated events with predefined metadata. You can define up to five key/value pairs per device. Keys and values can be chosen freely, and keys are auto-completed to avoid annoying typos. After defining metadata, you can filter the device list by metadata values, and the generated events contain the predefined metadata for further processing by your application. Details can be found in the event schema. |
4 | Connection status (Connection): indicates whether a connection has been established between your hardware and the Management Hub (Azure). Available statuses are Online, Offline and Unknown. If a device is unexpectedly offline, please contact our support. |
5 | Status: indicates whether the software is running on the respective device. Please refer to the section Camera and device control. |
6 | Auto refresh: automatically refreshes the page as soon as changes are made or a status changes. |
The connection status (Connection) of your devices can take the following values:

Status | Description |
---|---|
Online | The device is ready for operation and connected to power and the internet. |
Offline | The device is offline (no power, no internet, etc.). There are several simple things you can check before contacting our support team. |
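Because the generated events carry the device metadata described above, downstream applications can filter or group on it. A hypothetical sketch follows; the event fields shown are illustrative, please refer to the event schema for the actual structure.

```python
# Hypothetical events carrying device metadata (up to five key/value pairs
# per device); field names are illustrative, see the event schema for details.
events = [
    {"device": "box-01", "metadata": {"site": "garage-north", "level": "P1"}, "class": "car"},
    {"device": "box-02", "metadata": {"site": "garage-south", "level": "P2"}, "class": "truck"},
]

def filter_by_metadata(events, key, value):
    """Keep only events whose device metadata contains the given key/value pair."""
    return [e for e in events if e.get("metadata", {}).get(key) == value]

print(filter_by_metadata(events, "site", "garage-north"))
```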