
Self-Hosting a Dog Camera for Away-From-Home Monitoring

I try to have my cake and eat it too, by self-hosting a camera setup to check in on my new dog while I am away.

GenAI note: While I'm practicing my long-form writing and do not use any GenAI tooling for writing posts, I sometimes use GenAI on side projects for learning about and setting up new tools. In this post I prompted Claude when I had issues with my configuration, so the full example config at the end is not 100% written by me.

My partner and I adopted an elderly dog this past week. She is a wonderful bean!

I have friends who use a camera to check in on their dog when they are away. That would be convenient, but I feel uneasy about adding video surveillance to the inside of my own house. So part of the challenge with this project was to see if I could:

  1. Use an affordable camera.
  2. Block the cloud integration and telemetry that makes it affordable.
  3. Self host my own alternative.
  4. Make sure the cameras are not active when I am home.
  5. Continue to keep my total power usage down. [^ This has been an overarching goal for me with self-hosting.]

When it comes to self-hosting video cameras, Frigate is the big name in town. It's a really cool and flexible project, but because it is so customizable, setting up the configuration can be intimidating. My recommendation, and the docs', is to start small and build up. So in this post I'm going to retrace how I did that as an example for others to learn from.


Installing Frigate:

Frigate can be installed as a Home Assistant OS addon, however the docs state:

Frigate runs best with Docker installed on bare metal Debian-based distributions. For ideal performance, Frigate needs low overhead access to underlying hardware for the Coral and GPU devices. Running Frigate in a VM on top of Proxmox, ESXi, Virtualbox, etc. is not recommended

I currently have Home Assistant OS (HAOS) installed in a VM on Unraid, so I did not use the addon. Instead I used the Unraid community app.

Even though I didn't use the addon, I still plan to integrate with Home Assistant. So I had to do the following:

  • Install the MQTT addon. This handles both setting up an MQTT instance for you and integrating MQTT into Home Assistant.
  • Create a dedicated user in Home Assistant just for MQTT as the docs recommend.
  • Install the Frigate Integration for Home Assistant. This also requires HACS to be installed.

Once I had the pieces in place I could start setting up Frigate!


Minimum Configuration For Frigate:

Before my cameras or the Coral arrived I was eager to set something up while I waited. When you start Frigate for the first time it will generate an example config.yml that will not work because, well, it's just an example. So I ignored its contents and started fresh.

From what I could figure out, a working config needs at least two things:

  • An MQTT entry with the connection information
  • A cameras entry, though it can be an empty collection.

Bare minimum example:

mqtt:
  host: <IP of the HAOS host>
  user: "{FRIGATE_MQTT_USER}"
  password: "{FRIGATE_MQTT_PASSWORD}"
cameras: {}

Note: You can provide environment variable substitution for some, but not all, values in your config. The docs say to use the reference config as an example to know which ones.
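For context on where those variables come from: Frigate substitutes any environment variable prefixed with FRIGATE_ into the config. I set mine in the Unraid container template, but if you were running plain Docker Compose the equivalent would look roughly like this sketch (the values and paths are placeholders, not my real setup):

services:
  frigate:
    image: ghcr.io/blakeblackshear/frigate:stable
    environment:
      # Anything prefixed with FRIGATE_ can be referenced in
      # config.yml with {FRIGATE_...} substitution.
      FRIGATE_MQTT_USER: "frigate"
      FRIGATE_MQTT_PASSWORD: "change-me"
      FRIGATE_RTSP_USER: "camera-account"
      FRIGATE_RTSP_PASSWORD: "change-me"
    volumes:
      - ./config:/config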

With this configuration I was able to start up the Frigate container without any errors. After you run Frigate for the first time you will see that it appends something like version: 0.15-1 to the end of your configuration. It seems this is added to handle any migrations that are needed in the future.


Adding In a Camera:

There are lots of options for cameras, especially if you want to spend money. Luckily I only need to monitor a small indoor space, so the lighting is pretty controlled. If I needed a camera for outdoor use at a distance it would be a totally different story. The documentation has a recommended camera hardware section if you want a perfect setup. I did not, so I went for the cheap option.

I grabbed two TP-Link Tapo C120 [^ One downside of the camera is that the LED is not hardwired to forcibly turn on when the camera is recording.] cameras because:

  • They did not cost much.
  • You can set up a local account on them that does not need the cloud account to log in.
  • Even though they are not wired, my AP setup can handle the additional load.

Note: The Github readme for Frigate lists the video sources that the project supports. Frigate can use the proprietary Tapo camera protocol instead of RTSP; in fact this is the only way to use the camera's 2-way audio feature. However, this requires using the cloud account rather than the "camera" account, so it's a no-go for me.

Anyways, once I physically set the cameras up in my house, did the app onboarding, updated the firmware, etc., I changed the following in the app:

  • Set a name for the camera.
  • Under Advanced Settings for each camera I added a "camera account" with a username and password.
  • Under Network Settings I enabled a static IP address.
  • Disabled all the built in detection and alerts. [^ I'm not sure how much power those features consume, but I won't be using them anyways.]

Once that was done I started working on configuring Frigate to work with a single camera. The new pieces I added to my config were:

  • A camera entry under cameras with a name of your choice like tp_c120_0.
  • Detect only dogs.
  • Use different streams from the camera for detection and recording.

mqtt:
  host: <IP of the HAOS host>
  user: "{FRIGATE_MQTT_USER}"
  password: "{FRIGATE_MQTT_PASSWORD}"

cameras:
  tp_c120_0:
    ffmpeg:
      inputs:
        - path: rtsp://{FRIGATE_RTSP_USER}:{FRIGATE_RTSP_PASSWORD}@<IP of camera>:554/stream1
          roles:
            - record
        - path: rtsp://{FRIGATE_RTSP_USER}:{FRIGATE_RTSP_PASSWORD}@<IP of camera>:554/stream2
          roles:
            - detect
    detect:
      width: 640
      height: 360
      fps: 5
    objects:
      filters:
        dog:
          min_area: 1000
          max_area: 100000
          threshold: 0.7

Now when you add this you will see in your logs that it's using a CPU-based detector, which is only meant for testing, but we can remedy that.


Hardware Accelerated Object Detection:

There are lots of options available for object detection devices. The recommended hardware page provides some opinions. It also has this nice explanation:

A detector is a device which is optimized for running inferences efficiently to detect objects. Using a recommended detector means there will be less latency between detections and more detections can be run per second. Frigate is designed around the expectation that a detector is used to achieve very low inference speeds. Offloading TensorFlow to a detector is an order of magnitude faster and will reduce your CPU load dramatically.

Essentially Frigate has two passes where:

  • First it looks for motion in a frame.
  • If there is motion then the frames with motion are passed off to the hardware for object detection.

This means that Frigate does not need to do object detection on every single frame from all the cameras.
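If you ever need to influence that first pass, Frigate also has a motion section in the config. I left it at the defaults, but for illustration these are the kinds of knobs it exposes (the values below are just examples, not my settings):

motion:
  # How much a pixel has to change before it counts as motion.
  threshold: 30
  # Minimum size of a changed region before it is considered motion.
  contour_area: 10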

In terms of hardware for speeding up detection you could:

  • Use the integrated graphics, but that tends to put a high load on your CPU. [^ In fact, before I set up dedicated hardware for this the Frigate dashboard gave me a warning about high CPU usage.]
  • Use a dedicated Nvidia graphics card, but that goes against my goal of keeping the power draw on my self-hosted hardware down.
  • Use specialized hardware that is really power efficient.

For specialized hardware I started off looking at the Hailo accelerators, but they are quite pricey. I initially favored Hailo over Google's Coral hardware because Google has not refreshed the Corals in several years. However, the Frigate documentation still strongly encourages the Coral since it is cheap, if you can still find one. [^ Mouser actually had a bunch of stock when I looked!]

I went with the Coral Dual Edge TPU, which was only ~$40 plus tax, shipping, and tariffs. The Dual Edge version is, as expected, two TPUs slapped together. [^ This squared the output of a single GN Drive.] It uses the M.2 E key slot that a Wi-Fi card would normally go in. On my board the E key slot is sadly under an NVMe SSD, which I don't really like from a thermals perspective, but we will see.

Once I inserted the card I just needed to install the Unraid "Coral Accelerator Module Drivers" in the Apps center and update the Frigate container to run as privileged so it has the proper hardware access. Once that was done I followed the docs on configuring a detector:
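As an aside, running privileged is the blunt-instrument option. The Frigate docs also show passing through just the Coral device; in Docker Compose terms that would be something like the sketch below (on Unraid the equivalent is adding a Device entry to the template). The /dev/apex_1 line only applies if your M.2 slot actually exposes the second TPU on the Dual Edge card:

services:
  frigate:
    devices:
      # The PCIe Coral shows up as /dev/apex_0 once the driver is loaded.
      - /dev/apex_0:/dev/apex_0
      # Second TPU on the Dual Edge card, if the slot wires it up.
      - /dev/apex_1:/dev/apex_1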

  • Add a detector entry which describes the Coral hardware to use.
  • Enable ffmpeg hardware acceleration. [^ I'm on an Intel CPU so I'm using the quick sync option.]
  • Tell Frigate the resolution and frame rate to run detection at.
  • Tell Frigate what kind of objects we want to track.

mqtt:
  host: <IP of the HAOS host>
  user: "{FRIGATE_MQTT_USER}"
  password: "{FRIGATE_MQTT_PASSWORD}"

detectors:
  coral:
    type: edgetpu
    device: pci:0

ffmpeg:
  hwaccel_args: preset-intel-qsv-h264

detect:
  width: 640
  height: 360
  fps: 5

objects:
  track:
    - dog

cameras: 
  tp_c120_0:
    ffmpeg:
      inputs:
        - path: rtsp://{FRIGATE_RTSP_USER}:{FRIGATE_RTSP_PASSWORD}@<IP of camera>:554/stream1
          roles:
            - record
        - path: rtsp://{FRIGATE_RTSP_USER}:{FRIGATE_RTSP_PASSWORD}@<IP of camera>:554/stream2
          roles:
            - detect

Notice I moved detect and objects to the top level. You can put them under each camera if you want different settings per camera, but in my case I do not, and defining them globally reduces duplication.
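For example, if one camera ever did need its own settings, the same keys can be nested under that camera and they override the global values. A hypothetical tweak like this would lower the detection frame rate for a single camera while everything else keeps the global config:

cameras:
  tp_c120_0:
    detect:
      # Overrides the global fps of 5 for this camera only.
      fps: 3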


What resolution to stream?

You'll notice that for the detection process I only use 640x360 for the resolution. This is plenty for the detector and improves the rate at which the detector can run. However, that resolution leaves a lot to be desired for the live feed. Luckily Frigate bundles go2rtc, which helps us get around this. The setting-stream-for-live-ui section of the documentation was helpful for this.

Essentially I had to:

  • Add a new go2rtc section with the rtsp connection strings.
  • Update the camera entry path to use the new go2rtc connection information.
  • Mark the camera inputs with input_args: preset-rtsp-restream.
  • Add a live section to the camera telling it to use go2rtc.

mqtt:
  host: <IP of the HAOS host>
  user: "{FRIGATE_MQTT_USER}"
  password: "{FRIGATE_MQTT_PASSWORD}"

detectors:
  coral:
    type: edgetpu
    device: pci:0

detect:
  width: 640
  height: 360
  fps: 5

ffmpeg:
  hwaccel_args: preset-intel-qsv-h264

objects:
  track:
    - dog

go2rtc:
  streams:
    tp_c120_0:
      - rtsp://{FRIGATE_RTSP_USER}:{FRIGATE_RTSP_PASSWORD}@<IP of camera>:554/stream1
    tp_c120_0_sub:
      - rtsp://{FRIGATE_RTSP_USER}:{FRIGATE_RTSP_PASSWORD}@<IP of camera>:554/stream2

cameras: 
  tp_c120_0:
    ffmpeg:
      inputs:
        - path: rtsp://127.0.0.1:8554/tp_c120_0
          input_args: preset-rtsp-restream
          roles:
            - record
        - path: rtsp://127.0.0.1:8554/tp_c120_0_sub
          input_args: preset-rtsp-restream
          roles:
            - detect
    live:
      stream_name: tp_c120_0

Adding a Second Camera, Detection, Alerts, and Recording:

Finally, on the Frigate config side there are a few pieces left:

  • Enable recording, alerts, and detections with a record section.
  • Make sure we only retain footage where motion was detected.
  • Add a second camera to the configuration.

So the final configuration looks like:

mqtt:
  host: <IP of the HAOS host>
  user: "{FRIGATE_MQTT_USER}"
  password: "{FRIGATE_MQTT_PASSWORD}"

detectors:
  coral:
    type: edgetpu
    device: pci:0

detect:
  width: 640
  height: 360
  fps: 5

ffmpeg:
  hwaccel_args: preset-intel-qsv-h264

objects:
  track:
    - dog

record:
  enabled: True
  retain:
    days: 3
    mode: motion
  alerts:
    retain:
      days: 30
      mode: motion
  detections:
    retain:
      days: 30
      mode: motion

go2rtc:
  streams:
    tp_c120_0:
      - rtsp://{FRIGATE_RTSP_USER}:{FRIGATE_RTSP_PASSWORD}@<IP of camera>:554/stream1
    tp_c120_0_sub:
      - rtsp://{FRIGATE_RTSP_USER}:{FRIGATE_RTSP_PASSWORD}@<IP of camera>:554/stream2
    tp_c120_1:
      - rtsp://{FRIGATE_RTSP_USER}:{FRIGATE_RTSP_PASSWORD}@<IP of camera>:554/stream1
    tp_c120_1_sub:
      - rtsp://{FRIGATE_RTSP_USER}:{FRIGATE_RTSP_PASSWORD}@<IP of camera>:554/stream2

cameras: 
  tp_c120_0:
    ffmpeg:
      inputs:
        - path: rtsp://127.0.0.1:8554/tp_c120_0
          input_args: preset-rtsp-restream
          roles:
            - record
        - path: rtsp://127.0.0.1:8554/tp_c120_0_sub
          input_args: preset-rtsp-restream
          roles:
            - detect
    live:
      stream_name: tp_c120_0
  tp_c120_1:
    ffmpeg:
      inputs:
        - path: rtsp://127.0.0.1:8554/tp_c120_1
          input_args: preset-rtsp-restream
          roles:
            - record
        - path: rtsp://127.0.0.1:8554/tp_c120_1_sub
          input_args: preset-rtsp-restream
          roles:
            - detect
    live:
      stream_name: tp_c120_1

With that the cameras are all set up and working! Though this is really only scratching the surface of what Frigate offers. There are features like semantic search that I haven't even tried out yet.


Securing the Cameras:

OK, so the cameras were working and all wired up to Frigate. Now I just needed to sever them from the cloud. I couldn't use the unbind functionality in the app because:

"Once the device is removed from the app, the device would be reset to the factory default settings by default, and you need to reconfigure it from the very beginning if you want to control the device on the app again."

Instead I had to block the cameras from accessing the internet. In an ideal world I would set up VLANs to help me with this. Unfortunately I only have unmanaged switches [^ What can I say, they use less power than managed switches.] so I can't use VLANs.

I do have an OPNsense router though! So I blocked the cameras from accessing everything on WAN:

  • I made an alias in OPNsense with the camera IP addresses.
  • Made two new Firewall LAN rules: (1) A block action with the new alias as the source and "any" as the destination. [^ Optionally I could make another rule above this one to allow UDP port 123 for NTP.] (2) The same but with the source and destination flipped.
  • Made sure the new rules were before the "Default allow any to..." rules.
  • Cleared the firewall state for the cameras' IP addresses.

And with that the Tapo app can't access my cameras but Frigate can!


So That's It?

No, I still have some work to do, but that's enough setup for me to view the cameras in Home Assistant and check in on the dog when I'm away from home.

One of my requirements is still missing though! That's because the ability to dynamically enable and disable cameras is coming in the next release, 0.16.0. Right now I can only toggle detection and recording in HA, not the actual video stream. Once that release drops I will have the final piece I'm looking for!
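In the meantime, the Frigate integration already exposes switch entities for detection and recording, so those can be automated based on presence. Here is a rough Home Assistant automation sketch for pausing them whenever someone is home; the entity names are assumptions based on my camera names, so check what the integration actually created for you:

# Hypothetical automation: pause detection and recording while anyone is home.
alias: Pause dog cam while home
trigger:
  - platform: numeric_state
    entity_id: zone.home
    above: 0
action:
  - service: switch.turn_off
    target:
      entity_id:
        - switch.tp_c120_0_detect
        - switch.tp_c120_0_recordings
        - switch.tp_c120_1_detect
        - switch.tp_c120_1_recordings

A mirror automation with below: 1 and switch.turn_on would re-enable everything once the house is empty.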