Objective

Presence-based light automation is probably one of the hardest challenges of the smart home. Even after a few years of trying, I still couldn’t get it right. The ultimate dream? To ditch manual switches entirely, just because it’s cool.

When I was building my house, I gave the electrician a bag full of Shelly WiFi relays for all the light fixtures. He installed them into junction boxes and behind regular switches. The decision to go with a decentralized solution rather than a centralized KNX system probably wasn’t the best choice. But at the time, I wanted to play it safe and stick with traditional wiring that would make it easy to swap back to plain switches if I ever gave up. Even though the whole setup is stable, I do have to replace a Shelly every now and then.

Initially, the main use-case was just the ability to shut everything down at once. For example, tapping an NFC sticker in the hallway with my phone turns off all the lights, the TV, and the radio when it’s bedtime. Everything also shuts down automatically when everyone leaves the home. Even these two use-cases were enough to justify the investment of about 1000€ for the switches and installation.
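
A rough sketch of the kind of Home Assistant automation behind the NFC sticker; the tag ID and media player entities below are placeholders for illustration, not my actual configuration:

alias: Bedtime via hallway NFC sticker
trigger:
  - platform: tag
    tag_id: hallway-bedtime-tag            # placeholder tag ID
action:
  - service: light.turn_off                # all lights off
    entity_id: all
  - service: media_player.turn_off         # placeholder TV and radio entities
    entity_id:
      - media_player.living_room_tv
      - media_player.kitchen_radio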

But I was still flicking switches on and off manually every time I walked in and out of rooms, which was super annoying and inconvenient. To really nail lighting automation, the system needs to know exactly where people are. From that perspective, rooms tend to fall into two buckets:

  • “Active” rooms like hallways, wardrobes, and laundry rooms where you just walk in or move through.
  • “Passive” zones like living rooms and bathrooms where people tend to chill a bit and not move much. For example, lying on the couch watching Netflix or sitting in the dining room working on a blog post.

Active zones are a good fit for cheap infrared PIR sensors. They’re quick and effective at picking up movement but can’t keep track of someone just sitting around. I used to use 433 MHz RF motion sensors, but I’m starting to switch to Zigbee sensors.
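
For active zones, the automation can stay dead simple. A minimal sketch of the usual motion-timeout pattern, with placeholder entity names rather than my real ones:

alias: Hallway light on motion
mode: restart
trigger:
  - platform: state
    entity_id: binary_sensor.hallway_motion    # placeholder PIR sensor
    to: "on"
action:
  - service: light.turn_on
    entity_id: light.hallway                   # placeholder light
  - wait_for_trigger:
      - platform: state
        entity_id: binary_sensor.hallway_motion
        to: "off"
        for: "00:02:00"
  - service: light.turn_off
    entity_id: light.hallway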

Passive zones are trickier. Regular PIR sensors can’t spot someone sitting still, so lights might turn off when you don’t want them to. At first, I tried using a couple of cheap indoor cameras to cover these areas (a kitchen and living space combo of about 50 square meters). The cameras stream footage to Frigate, an open source NVR that can detect a person in the video stream in real time and forward that information to Home Assistant, which can then turn lights on or off. But if someone’s partly hidden or under a blanket, Frigate won’t recognize them and the lights will turn off anyway.

Here is a demonstration of Frigate detecting a person (me) and Home Assistant turning the light on as I enter the zone and off after I leave.

To fix the occasional loss of tracking, I added two mmWave sensors that can catch even tiny movements, like breathing. They’re more expensive (a PIR costs about 10€, a mmWave about 50€), but I managed to DIY my own with cheap mmWave modules and a microcontroller, bringing the cost down to about 10€ each. These sensors work together with the cameras to make sure lights only go out when absolutely no one’s around.

Plus, external light sensors help avoid unnecessary lighting during the day, and a separate system makes sure any lights the kids turn on in daylight are switched off quickly. This setup also automatically turns off the TV if no one’s in the living room; my family isn’t thrilled about it, but it’s a win for saving energy.

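As a sketch, the daylight gate is just an extra condition on the automations that turn lights on; the illuminance sensor name and threshold here are assumptions, not my actual values:

condition:
  - condition: numeric_state
    entity_id: sensor.outdoor_illuminance    # placeholder lux sensor
    below: 20                                # only allow lights below roughly 20 lx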

Hardware

All lights are managed by Shelly WiFi switches.

The indoor cameras are from the Chinese company Yi, chosen for their affordability and compatibility with the Yi-hack firmware, which disables the connection to the Chinese cloud and adds RTSP support. This allows the video feed to be integrated into any NVR system, in my case the open source Frigate.
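
A minimal sketch of how such a camera might be declared in Frigate’s configuration; the camera name, IP address, and stream path are placeholders (yi-hack exposes an RTSP stream, but the exact URL depends on the firmware variant):

cameras:
  kitchen_cam:                                   # placeholder camera name
    ffmpeg:
      inputs:
        - path: rtsp://192.168.1.50/ch0_0.h264   # placeholder yi-hack RTSP URL
          roles:
            - detect
    detect:
      width: 1920
      height: 1080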

Frigate supports person detection, but to run it in real time it practically needs a dedicated computer vision accelerator such as the Google Coral; CPU-only detection works, just much more slowly.
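
With a USB Coral, the detector section of the Frigate config looks roughly like this (the standard edgetpu setup; adjust the device entry if you use the PCIe variant):

detectors:
  coral:
    type: edgetpu
    device: usb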

The homemade mmWave presence sensors use an ESP32 microcontroller paired with either an LD2420 or LD2410 radar module. There are also commercial options like the Aqara FP2, which also seems pretty good.

Integration in Home Assistant

Frigate feeds real-time presence data into Home Assistant through the integration provided by the Frigate developers. It exposes occupancy per zone, if zones are defined; in our case the dining room zone shows up as the binary_sensor.jedilnica_all_occupancy entity used in the automation below.
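
Roughly, the zone behind that entity is drawn in the Frigate config like this; the camera name and polygon coordinates below are placeholders, not my actual values:

cameras:
  kitchen_cam:                                   # placeholder camera name
    zones:
      jedilnica:                                 # dining-room zone
        coordinates: 0,461,3,0,1919,0,1919,843   # placeholder polygon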

The ESP32 with the LD2420 runs ESPHome firmware, which integrates natively with Home Assistant. Among the exposed entities is the presence binary sensor (binary_sensor.ld2420_jedilnica_kuhinja_presence) used in the automation below.
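
A minimal ESPHome sketch for the LD2420 part, assuming the module is wired to a UART on GPIO16/GPIO17 (the pins and sensor name are placeholders; the usual esphome, wifi and api sections are omitted):

uart:
  id: ld2420_uart
  tx_pin: GPIO17        # placeholder wiring
  rx_pin: GPIO16
  baud_rate: 115200
  parity: NONE
  stop_bits: 1

ld2420:

binary_sensor:
  - platform: ld2420
    has_target:
      name: Presence    # exposed to Home Assistant as a presence binary sensor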

Automation Logic

The automation consists of two parts:

  1. Dual tracking of presence using both the camera and the mmWave sensor, implemented in AppDaemon.
  2. Controlling light states based on the presence detected by the above method.

The AppDaemon app that detects and tracks presence does the following:

  1. Each room or zone has the following properties:
    • name of the output entity that represents presence (composite_entity)
    • name of the mmWave sensor’s presence entity (mmwave_sensor)
    • name of the presence entity of the video-based person recognition (cv_sensor)
  2. Upon person detection by video (cv_sensor), the app sets the output entity composite_entity to on and schedules a verification by calling the function check() after 10 seconds.
  3. The verification function check() looks at the status of both detection methods:
    • If neither detects presence, the output entity composite_entity is set to off
    • If at least one indicates presence, a recheck using check() occurs in 5 seconds, continuing until both agree on absence.

This repeated verification ensures reliable presence detection before any lights are touched.

AppDaemon app:

import appdaemon.plugins.hass.hassapi as hass

# Per-room configuration (room names are Slovenian:
# jedilnica = dining room, kuhinja = kitchen, dnevna = living room).
#   composite_entity ... output entity representing combined presence
#   mmwave_sensor ...... presence entity of the mmWave sensor
#   cv_sensor .......... presence entity of the camera-based person detection
JEDILNICA_CONFIG = {
    "composite_entity": "binary_sensor.composite_occupancy_jedilnica",
    "mmwave_sensor": "binary_sensor.ld2420_jedilnica_kuhinja_presence",
    "cv_sensor": "binary_sensor.jedilnica_all_occupancy"
}

KUHINJA_CONFIG = {
    "composite_entity": "binary_sensor.composite_occupancy_kuhinja",
    "mmwave_sensor": "binary_sensor.ld2420_jedilnica_kuhinja_presence",
    "cv_sensor": "binary_sensor.kuhinja_all_occupancy"
}

DNEVNA_CONFIG = {
    "composite_entity": "binary_sensor.composite_occupancy_dnevna",
    "mmwave_sensor": "binary_sensor.ld2410_dnevna_kavc_presence",
    "cv_sensor": "binary_sensor.dnevna_all_occupancy"
}


class Occupancy(hass.Hass):
    def initialize(self):
        # Start every composite entity as 'off' and react whenever the
        # camera-based sensor reports a person.
        self.set_state(JEDILNICA_CONFIG.get('composite_entity'), state='off')
        self.listen_state(self.on, JEDILNICA_CONFIG.get('cv_sensor'), new='on', config=JEDILNICA_CONFIG)

        self.set_state(KUHINJA_CONFIG.get('composite_entity'), state='off')
        self.listen_state(self.on, KUHINJA_CONFIG.get('cv_sensor'), new='on', config=KUHINJA_CONFIG)

        self.set_state(DNEVNA_CONFIG.get('composite_entity'), state='off')
        self.listen_state(self.on, DNEVNA_CONFIG.get('cv_sensor'), new='on', config=DNEVNA_CONFIG)

    def on(self, entity, attribute, old, new, kwargs):
        # The camera saw a person: mark the zone as occupied and start the
        # verification loop.
        config = kwargs.get('config')
        composite_entity = config.get('composite_entity')
        if self.get_state(composite_entity) == "on":
            return
        self.set_state(composite_entity, state='on')
        self.run_in(self.check, 10, config=config)

    def check(self, kwargs):
        # Release the zone only when camera and mmWave sensor both report
        # absence; otherwise check again in 5 seconds.
        config = kwargs.get('config')
        if self.get_state(config.get('mmwave_sensor')) == "off" \
        and self.get_state(config.get('cv_sensor')) == "off":
            self.set_state(config.get('composite_entity'), state='off')
        else:
            self.run_in(self.check, 5, config=config)

The second part is the turn-on or turn-off action based on the composite entity defined by the code above. To achieve this, I’m using basic Home Assistant automations.

Example for turning off the lights in the dining room; similar automations are defined for each room, for both turning on and off.
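
A minimal sketch of such a turn-off automation, assuming the dining-room light is light.jedilnica (a placeholder name); the trigger entity is the composite sensor created by the AppDaemon app:

alias: Dining room lights off when unoccupied
trigger:
  - platform: state
    entity_id: binary_sensor.composite_occupancy_jedilnica
    to: "off"
condition:
  - condition: state
    entity_id: light.jedilnica      # placeholder light entity
    state: "on"
action:
  - service: light.turn_off
    entity_id: light.jedilnica
mode: single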