Services

We are surrounded by millions of sounds that contain important clues about our environment. For instance, if you hear someone screaming, you know there is an emergency, and if you hear a siren, you know an emergency vehicle is approaching. While humans are adept at contextualizing sounds, computers still lag behind in this regard. How great would it be if AI could understand sounds just as well?

Cochl.Sense allows computers to understand what is going on around them by enabling them to listen to their surroundings. Simply send an audio file or an audio stream from a microphone, and our system will tell you what is happening.
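
As a rough illustration, the sketch below uploads an audio file and reads back the detected event tags. The endpoint URL, request fields, and response shape used here are assumptions for illustration only; refer to the API reference and the credentials shown in your project dashboard for the actual details.

```python
import requests

# Hypothetical endpoint and API key -- check your dashboard and the official
# API reference for the real values and request format.
API_URL = "https://api.example.com/sense/v1/inference"
API_KEY = "YOUR_PROJECT_API_KEY"

def detect_events(audio_path: str) -> list[str]:
    """Upload an audio file and return the list of detected event tags."""
    with open(audio_path, "rb") as audio_file:
        response = requests.post(
            API_URL,
            headers={"Authorization": f"Bearer {API_KEY}"},
            files={"file": audio_file},
            timeout=30,
        )
    response.raise_for_status()
    # Assumed response shape: {"events": [{"tag": "Siren", ...}, ...]}
    return [event["tag"] for event in response.json().get("events", [])]

if __name__ == "__main__":
    print(detect_events("doorbell_recording.wav"))
```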

Cochl.Sense is split into different categories:

Emergency Detection

The following events can be detected when creating an Emergency Detection project in the dashboard; a brief usage sketch follows this list.

  'Fire_smoke_alarm'
  'Glass_break'
  'Gunshot'
  'Scream'
  'Siren'
  'Others'
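
Building on the hypothetical `detect_events` helper sketched above, the snippet below shows one way an application might react to these tags, for example by raising an alert whenever a critical emergency sound is detected. The tag names come from the list above; everything else is an illustrative assumption rather than part of the official API.

```python
# Tags from the Emergency Detection list above; 'Others' is ignored here
# because it groups sounds outside the named emergency classes.
CRITICAL_TAGS = {"Fire_smoke_alarm", "Glass_break", "Gunshot", "Scream", "Siren"}

def handle_emergency_audio(audio_path: str) -> None:
    """Detect events in a clip and alert on any critical emergency tag."""
    detected = set(detect_events(audio_path))  # hypothetical helper from above
    for tag in sorted(detected & CRITICAL_TAGS):
        print(f"ALERT: detected '{tag}' in {audio_path}")
```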

Human Interaction

The following events can be detected when creating a Human Interaction project in the dashboard.

  'Double_clap'
  'Finger_snap'
  'Knock'
  'Whistle'
  'Others'

Human Status

The following events can be detected when creating a Human Status project in the dashboard.

  'Burp'
  'Cough'
  'Fart'
  'Hiccup'
  'Laughter'
  'Sigh'
  'Sneeze'
  'Snore'
  'Yawn'
  'Others'

Home Context

The following events can be detected when creating a Home Context project in the dashboard.

  'Baby_cry'
  'Dining_clink'
  'Dog_bark'
  'Electric_shaver_or_toothbrush'
  'Knock'
  'Toilet_flush'
  'Water_tap_or_liquid_fill'
  'Others'