Alert-lib

Alert-lib is a library of utilities built on top of the camera layer that provide distance estimation, speed calculation, alerts, and AR overlays.

Distance estimation

The distance estimator estimates an object's distance from the camera.

The StaticDistanceEstimator uses an assumed object width/height together with the camera's properties to estimate the distance between the camera and an object. The estimator works with cars and pedestrians by default, but may be extended if other objects are needed for your use case. Distances are in meters for the default values; for custom values, the distance unit matches the unit of the provided size.
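As an illustration of the underlying idea, a minimal pinhole-camera sketch follows. This is not the SDK's actual implementation; the function name and parameters are hypothetical.

```swift
import Foundation

/// Pinhole-camera distance estimate from an assumed real-world object width:
/// distance = realWidth * focalLength * imageWidthPx / (bboxWidthPx * sensorWidthMM)
/// The result is in the same unit as `realWidth` (meters here).
func pinholeDistance(realWidth: Double,
                     focalLengthMM: Double,
                     imageWidthPx: Double,
                     bboxWidthPx: Double,
                     sensorWidthMM: Double) -> Double {
    (realWidth * focalLengthMM * imageWidthPx) / (bboxWidthPx * sensorWidthMM)
}

// Example: an assumed 1.8 m wide car seen as a 100 px box in a 640 px frame,
// with a 3.99 mm lens and a 5.6 mm wide sensor, is roughly 8.2 m away.
let d = pinholeDistance(realWidth: 1.8, focalLengthMM: 3.99,
                        imageWidthPx: 640, bboxWidthPx: 100, sensorWidthMM: 5.6)
```

Note how the assumed real-world size cancels pixel scale: a larger bounding box for the same assumed width yields a smaller distance.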

The following list describes the method parameters:

  • recognitions - List of recognitions to process.
  • frameSize - Size of the image that recognitions were derived from at 0 rotation.
  • sensorRotation - Clockwise rotation relative to the frame's natural orientation, as a multiple of 90 degrees.
  • focalLength - Focal length of the capture camera in millimeters.
  • sensorSize - Size of the physical camera sensor in millimeters.

The focal length for the camera pixel buffer can be fetched using the following code:

extension CMSampleBuffer {
    /// Extracts the metadata dictionary attached to this `CMSampleBuffer`
    /// (for example, EXIF keys such as `Aperture`, `Brightness`, `Exposure`, and `FocalLength`).
    ///
    /// - Returns: The metadata dictionary, or `nil` if none is attached.
    public func metadata() -> [String : Any]? {
        if let cfmetadata = CMCopyDictionaryOfAttachments(allocator: kCFAllocatorDefault, target: self, attachmentMode: kCMAttachmentMode_ShouldPropagate) {
            if let metadata = cfmetadata as? [String : Any] {
                return metadata
            }
        }
        return nil
    }
}
let staticDistance = LSDStaticDistanceEstimator()

var recognitions: [LSDRecognizedObject] = []

let objectRecognition = LSDRecognizedObject()
objectRecognition.uuid = UUID()
objectRecognition.label = "type"
objectRecognition.bbox = CGRect(x: 100, y: 100, width: 250, height: 250)
recognitions.append(objectRecognition)

let sensorSize = CGSize(width: 5.6, height: 4.2) // Sensor size in mm for iPhone XR / 11

staticDistance.estimateDistance(recognitions: recognitions,
                                frameSize: CGRect(x: 0, y: 0, width: 640, height: 480),
                                sensorRotation: 0,
                                focalLength: 3.99,
                                sensorSize: sensorSize)
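The hard-coded focal length above can instead be read from the buffer's EXIF metadata via the `metadata()` extension shown earlier. The keys below come from ImageIO; this lookup is a sketch, since availability of the EXIF dictionary depends on the capture pipeline.

```swift
import CoreMedia
import ImageIO

/// Reads the lens focal length (in millimeters) from a sample buffer's
/// EXIF metadata, if present. Returns nil when no EXIF data is attached.
func focalLength(from sampleBuffer: CMSampleBuffer) -> Float? {
    guard let metadata = sampleBuffer.metadata(),
          let exif = metadata[kCGImagePropertyExifDictionary as String] as? [String: Any],
          let focal = exif[kCGImagePropertyExifFocalLength as String] as? NSNumber else {
        return nil
    }
    return focal.floatValue
}
```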

Note

The distance value for a Recognition may already be available from the ML model.

RoI - Region of Interest

A Region of Interest (RoI) defines a region of the source image to enable filtering of detections based on their location in the image. For example, when only detections in front of the vehicle are relevant, detections on the far side of the road can be removed.

The SDK defines the following types of regions:

  • Primary: This region type typically represents the ego lane
  • Secondary: This region type typically represents a region including adjacent lanes on either side of the ego lane

The SDK can generate an RoI via the following methods:

  • makeStaticRoi: A static RoI is generated based on vertices defined by the application

    Note

    The static RoI vertices must meet the following constraints:

    • The vertices must form a convex polygon
    • The x and y of each vertex must have a value between 0.0 to 1.0, representing a percentage of the image's width and height respectively
  • makeLaneRoi: The lane RoI is generated based on the output of the Road Lanes model
    • The primary region is defined by the 2 ego lane markings
    • The secondary region is defined by the adjacent lane markings, if present
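The constraints on static RoI vertices can be checked before calling makeStaticRoi. The following is a minimal sketch; the `isValidRoi` helper and its `Point` type are hypothetical, standing in for Point2f.

```swift
struct Point { let x: Double; let y: Double }

/// Checks that every vertex lies in [0, 1] and that consecutive edge
/// cross products share a sign, i.e. the polygon is convex.
func isValidRoi(_ pts: [Point]) -> Bool {
    guard pts.count >= 3,
          pts.allSatisfy({ (0.0...1.0).contains($0.x) && (0.0...1.0).contains($0.y) }) else {
        return false
    }
    var sign = 0.0
    for i in 0..<pts.count {
        let a = pts[i], b = pts[(i + 1) % pts.count], c = pts[(i + 2) % pts.count]
        let cross = (b.x - a.x) * (c.y - b.y) - (b.y - a.y) * (c.x - b.x)
        if cross != 0 {
            if sign != 0 && sign * cross < 0 { return false } // sign flip: concave
            sign = cross
        }
    }
    return true
}

// A trapezoid covering the ego lane passes; an out-of-range vertex fails.
let ok = isValidRoi([Point(x: 0.10, y: 0.95), Point(x: 0.46, y: 0.15),
                     Point(x: 0.54, y: 0.15), Point(x: 0.90, y: 0.95)])
```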

Alerts

Live Sense SDK supports several types of alerts via the LSDAlertManager class.

Entry

Entry alerts signal to the application when a detection first enters the primary RoI region.

These alerts support pedestrians, to alert the driver when a pedestrian has entered the road, and vehicles, to alert the driver of another vehicle merging into the ego lane.

Lane departure

Note

This feature is in Beta.

A lane departure alert indicates that the subject vehicle is leaving the ego lane and encroaching on the adjacent region.

There are two severities of lane departure alerts:

  • WARNING: The subject vehicle is likely drifting from its lane rather than intentionally changing lanes.
  • INFO: The subject vehicle has departed the lane.

Note

For lane departure alerts, the SDK assumes the following:

  • The camera is reasonably well centered within the vehicle and facing straight ahead.
    • If the camera is placed too far to one side of the vehicle, the SDK may generate false positive alerts.
  • The camera has a field of view similar to a phone's wide-angle camera.
    • Ultra-wide lenses cause false positive alerts.

TTC - Time to Collision

Live Sense SDK can alert the user/driver of the time to collision with a detected object. This feature helps to avoid accidents and ensure driver safety. It applies to car, car-brake-light-on, and pedestrian detections in the vehicle's expected path. Time to Collision, or TTC, is calculated as follows:

Time to Collision (T) = -1 * d / ( Δd / Δt ) when Δd < 0

  • d = distance to the object
  • Δd = change in distance to the object between processed frames
  • Δt = change in time between processed frames
  • Speed of the vehicle - provided by the device's location services as the instantaneous speed of the device, measured in meters per second by Apple's Core Location framework.

    Note

    If the reported speed is negative or infinite, the speed is instead calculated from the distance between the latest GPS coordinate and the previous one, using the formula speed = distance / duration, in meters per second.

  • Distance to the object - provided using the StaticDistanceEstimator
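The GPS fallback described in the note above can be sketched with Core Location. This is an assumed usage pattern, not the SDK's internal code.

```swift
import CoreLocation

/// Fallback speed in m/s from two consecutive location fixes, for use when
/// `CLLocation.speed` is negative (i.e. invalid).
func fallbackSpeed(previous: CLLocation, latest: CLLocation) -> Double {
    let duration = latest.timestamp.timeIntervalSince(previous.timestamp)
    guard duration > 0 else { return 0 }
    // CLLocation.distance(from:) returns meters; duration is seconds.
    return latest.distance(from: previous) / duration
}
```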

The distance between each detection and the point of view is calculated, and an alert is sent if the distance falls below the specified alertDistance.

This calculation only applies when the distance to an object is decreasing. Otherwise, TTC is infinite and no alert is generated.
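The formula above, including the decreasing-distance condition, can be sketched as a pure function. This is an illustration; the SDK performs this calculation internally.

```swift
/// Time to collision in seconds: T = -1 * d / (Δd / Δt) when Δd < 0.
/// Returns .infinity when the distance is not decreasing, in which case
/// no alert is generated.
func timeToCollision(distance d: Double, deltaDistance dd: Double, deltaTime dt: Double) -> Double {
    guard dd < 0, dt > 0 else { return .infinity }
    return -1 * d / (dd / dt)
}

// Example: 10 m away, closing 1 m every 0.25 s (4 m/s) -> TTC = 2.5 s.
let ttc = timeToCollision(distance: 10, deltaDistance: -1, deltaTime: 0.25)
```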

LSDAlertManager contains extra heuristics for alerting the user of possible hazards in their path. This includes time-to-collision with the leading vehicle and people entering the vehicle's direct path.

The following list describes the different severity of alerts based on the TTC value:

  • INFO - If T > 2.5 s
  • WARNING - If 1.8 s < T <= 2.5 s
  • ALERT - If T <= 1.8 s

These are the default configured values. The values for WARNING and ALERT can be changed by the following properties of LSDAlertSettings:

  • timeToCollisionAlert (Default is 1.8s)
  • timeToCollisionWarning (Default is 2.5s)
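The thresholds above can be illustrated with a hypothetical helper; in practice the SDK applies these rules internally based on LSDAlertSettings.

```swift
enum TTCSeverity { case info, warning, alert }

/// Maps a TTC value to a severity using the default thresholds.
func severity(ttc: Double,
              warningThreshold: Double = 2.5, // timeToCollisionWarning default
              alertThreshold: Double = 1.8)   // timeToCollisionAlert default
              -> TTCSeverity {
    if ttc <= alertThreshold { return .alert }
    if ttc <= warningThreshold { return .warning }
    return .info
}
```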

For LSDAlertManager to fully function it requires the following input:

  • LSDAlertSettings - For more information on this class, see the API Reference
  • The vehicle's current speed, normally provided by the device's location services, continuously fed in via currentSpeed.
  • The latest collection of LSDRecognizedObject continuously fed in via determineAlerts().

// Set up alertManager
let alertManager = LSDAlertManager()
let entryTypes: [String] = ["car-brake-light-on", "pedestrian", "car"]
var recognitions: [LSDRecognizedObject] = []

// Enable entry alerts for test
let alertSettings = LSDAlertSettings()
alertSettings.useOnEntry = true
alertManager.alertSettings = alertSettings

// Set RoI
var roiFilter = ROI()

let defaultRoiPoints = [
    Point2f(x: 0.10, y: 0.95),
    Point2f(x: 0.46, y: 0.15),
    Point2f(x: 0.54, y: 0.15),
    Point2f(x: 0.90, y: 0.95)
]

let currentPixelHeight = pixelBufferWidth
let currentPixelWidth = pixelBufferHeight

if let roi = roiFilter.makeStaticRoi(withWidth: Int32(currentPixelWidth),
                                     frameHeight: Int32(currentPixelHeight),
                                     primaryPoints: defaultRoiPoints,
                                     secondaryPoints: defaultRoiPoints) {
    // Pass RoI to alertManager
    alertManager.setRoi(roi: roi)
}

let objectRecognition = LSDRecognizedObject()
objectRecognition.uuid = UUID()
objectRecognition.label = "type"
objectRecognition.bbox = CGRect(x: 100, y: 100, width: 250, height: 250)
objectRecognition.frameTimeStamp = Date()
recognitions.append(objectRecognition)

// Add camera properties
let sensorSize = CGSize(width: 5.6, height: 4.2) // Sensor size in mm for iPhone XR / 11
let cameraProperties = LSDCameraProperties(lensFacing: 0,
                                           sensorOrientation: UIDevice.current.orientation.isLandscape ? 90 : 0,
                                           sensorSize: sensorSize,
                                           focalLength: Float(3.99),
                                           horizontalFov: 0.0,
                                           verticalFov: 0.0)
alertManager.cameraProperties = cameraProperties

// Entry type alerts
let alerts: [LSDAlert] = alertManager.determineAlerts(recognitions: recognitions, frameWidth: 640, frameHeight: 480, datetime: 1)
