Having a home dashboard was one of my objectives recently, and besides the fun aspect of coding, I wanted something a bit futuristic, you know, like what we see in many movies or TV shows.
Let’s see what we got for this first version!
It might be obvious to some of you, but the screenshots and videos you’ll see in this article display real data. This is not a demo or a mock-up.
Genesis
Every morning follows the same ritual: checking the weather forecast, the temperature and air quality in the apartment, scheduling the vacuum and mop robot to do its duty, checking the next bus or tramway to work, the day’s agenda, and so on.
For each of these tasks, we have connected devices or apps. Each one is operated by its own company, which means several authentication processes, different user interfaces, etc.
This was the first goal of this dashboard: gathering all of this data on a single screen.
The second goal was to have something similar to what we see in TV shows or movies. I’m a huge fan of Sci-Fi and Cyberpunk stuff: Altered Carbon, Cyberpunk 2077, Foundation, Stargate, The Expanse, you name it.
Those shows always have shiny user interfaces, futuristic and smooth. We all want that. Still, 2022 is almost here, and we are not even close to what we can imagine in movies. Technically speaking, we could do it, but in general this is not what generates profit in business. This is why we often have old-school, clunky user interfaces, even in industries like aerospace: the main goal is to keep rocket launchers, satellites, and probes from crashing, not to show off shiny user interfaces.
Even if companies like SpaceX are starting to change the game.
Lady of Shalott?

When I started to work on the first version, we were under strict lockdown due to the pandemic outbreak. It reminded me of the tale where a woman is cursed and condemned to stay in a tower, seeing the outside world only through her mirror.
Since the first version of this dashboard was meant to be embedded in a mirror, it made a lot of sense, contextually speaking!
The first version
Let’s see the result!
I know we are in the middle of the article, but I’ll talk about tech in the following sections, so it is probably better to show the result now if you are not a tech person.
You might be interested in the last chapter [What’s Next?], though!
So here is what it looks like for the first version, with, as it happens, one of the alarms ringing because of the air quality:
And another video of the dashboard in production, at the entrance of our apartment:
A screenshot with a better resolution than the two previous videos (click to zoom):
Let’s talk about the tech behind it
I didn’t use exotic tech here. Everything I used is relatively standard in modern web development.
This is a simple web application with a JavaScript client interpreted by a browser, calling an HTTP server responsible for collecting and cleaning the data.
Everything runs locally in the apartment, and it is not accessible outside.
Client
First of all, I’m not a huge fan of modern web technologies like React, Vue, and co. I think the ecosystem became too complicated because of the underlying foundations of browsers and JavaScript. We tried to build advanced user interface technologies on top of outdated ones, and we now have to live with the result. Personally, I find it hard to fully trust what we build with that.
But they remain suitable technologies because they allow us to build advanced stuff, and that is what matters in the end.
Basically, the frontend is built with plain React, without anything like Redux. It is written in JavaScript, not TypeScript; there is no real reason behind this choice except keeping a simple toolchain.
I relied on Arwes and ApexCharts to get something reasonably good-looking!
For the first version, the result more or less matches my expectations. But with the increasing number of animations and data updates, it sometimes stutters. For an optimal, as-smooth-as-possible experience, I’ve taken a look at WebGPU. Still, the technology itself is not ready, and too few libraries support it.
Server
The server is doing two kinds of tasks:
- Retrieving data on demand. For example, the news or weather forecast APIs are called by the server only when the client asks for them, which it does periodically
- Retrieving data regularly so that they are ready when the client asks for them. For example, the data related to the indoor environment of the vacuum robot are retrieved every x minutes and refined ahead of time
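The second kind of task boils down to a background job writing into a cache that the request handlers read from. Here is a minimal sketch of that pattern, using a stdlib time.Ticker in place of the actual scheduler; all names are illustrative:

```go
package main

import (
	"sync"
	"time"
)

// Cache holds data refreshed in the background so that client requests
// never wait for a slow upstream call.
type Cache struct {
	mu   sync.RWMutex
	data map[string]string
}

func NewCache() *Cache { return &Cache{data: map[string]string{}} }

func (c *Cache) Set(key, value string) {
	c.mu.Lock()
	defer c.mu.Unlock()
	c.data[key] = value
}

func (c *Cache) Get(key string) (string, bool) {
	c.mu.RLock()
	defer c.mu.RUnlock()
	v, ok := c.data[key]
	return v, ok
}

// refreshEvery periodically runs fetch and stores the result, mirroring
// how data can be refined ahead of the client's request. A scheduling
// library would replace the raw ticker in a real server.
func refreshEvery(c *Cache, key string, interval time.Duration, fetch func() string) {
	ticker := time.NewTicker(interval)
	go func() {
		for range ticker.C {
			c.Set(key, fetch())
		}
	}()
}
```

The read/write lock lets many concurrent client requests read the cache while a single background goroutine updates it.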
The server is written in Go, with very few external libraries. The HTTP server comes from the standard library. For the asynchronous tasks fulfilling the second point described above, go-cron is used.
I needed a network discovery tool to develop the presence feature; for that, I used Go nmap bindings.
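As an illustration of the idea, presence detection can be sketched as a ping sweep whose output is parsed for live hosts. This sketch shells out to the nmap binary directly instead of going through the bindings; the parsing of nmap’s normal output is the interesting part:

```go
package main

import (
	"os/exec"
	"regexp"
)

// hostRe matches lines like "Nmap scan report for phone.lan (192.168.1.23)"
// or "Nmap scan report for 192.168.1.24" in nmap's normal output.
var hostRe = regexp.MustCompile(
	`Nmap scan report for (?:([^\s]+) \()?(\d+\.\d+\.\d+\.\d+)\)?`)

// parseHosts extracts the IPs found by an "nmap -sn" ping sweep.
func parseHosts(output string) []string {
	var ips []string
	for _, m := range hostRe.FindAllStringSubmatch(output, -1) {
		ips = append(ips, m[2])
	}
	return ips
}

// scanLAN runs a ping sweep over the given CIDR range (requires the
// nmap binary to be installed).
func scanLAN(cidr string) ([]string, error) {
	out, err := exec.Command("nmap", "-sn", cidr).Output()
	if err != nil {
		return nil, err
	}
	return parseHosts(string(out)), nil
}
```

Presence then reduces to checking whether a known phone’s IP or hostname shows up in the sweep.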
Some data were not accessible through an API. This is the case for nearby transportation; I had to scrape web pages for this kind of data. I used colly to do so!
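As a sketch of the scraping idea using only the standard library (the real code uses colly, and the markup below is entirely hypothetical):

```go
package main

import (
	"regexp"
)

// The transit page is assumed, purely for illustration, to render
// departures as <span class="departure-time">08:13</span>; the real
// markup differs.
var depRe = regexp.MustCompile(
	`<span class="departure-time">(\d{2}:\d{2})</span>`)

// nextDepartures pulls the departure times out of a page's HTML.
// A scraping library such as colly does this with CSS-selector
// callbacks instead of a regexp.
func nextDepartures(html string) []string {
	var times []string
	for _, m := range depRe.FindAllStringSubmatch(html, -1) {
		times = append(times, m[1])
	}
	return times
}
```

The fragility of this approach is exactly why an official API is preferable when one exists: any change to the page markup breaks the extraction.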
One of the trickiest parts was communicating with the vacuum robot. I didn’t want to implement the whole communication protocol myself, yet I didn’t find any suitable Go library for it (a miio protocol implementation with support for the Roborock S7).
After finding a suitable library in Python, I developed a small module responsible for collecting the vacuum data and drawing the map with the last known path. I’m not used to this ecosystem, but there is so much support and documentation on the web that it was not that difficult in the end. This Python component is used by the server directly.
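One simple way to wire such a Python helper into a Go server is to run it as a subprocess and exchange JSON over stdout. A sketch, with a hypothetical script name and field names:

```go
package main

import (
	"encoding/json"
	"os/exec"
)

// VacuumState is what the (hypothetical) Python helper prints as JSON
// on stdout after talking to the robot over the miio protocol.
type VacuumState struct {
	Battery int    `json:"battery"`
	State   string `json:"state"`
	MapPNG  string `json:"map_png"` // path of the rendered map image
}

// decodeVacuumState parses the helper's JSON output.
func decodeVacuumState(raw []byte) (VacuumState, error) {
	var s VacuumState
	err := json.Unmarshal(raw, &s)
	return s, err
}

// fetchVacuumState runs the Python module and decodes its output.
// "vacuum.py" is an illustrative name, not the actual script.
func fetchVacuumState() (VacuumState, error) {
	out, err := exec.Command("python3", "vacuum.py").Output()
	if err != nil {
		return VacuumState{}, err
	}
	return decodeVacuumState(out)
}
```

Keeping the boundary as plain JSON means either side can be rewritten later without touching the other.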
Where does the data come from?
Surprisingly, most of the data come from dedicated APIs:
- Netatmo API for indoor sensors
- PCloud API for cloud drive
- Google Calendar API for agenda, with a dedicated Go SDK
- News API for top headlines
- Open Weather API for weather forecast
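As an example of one of these integrations, fetching the current weather can be sketched like this. Only a couple of fields of OpenWeather’s current-weather response are decoded, and the city and API key are assumed to come from the server’s configuration:

```go
package main

import (
	"encoding/json"
	"fmt"
	"net/http"
)

// Weather mirrors the few fields of the OpenWeather "current weather"
// response that a dashboard widget typically needs.
type Weather struct {
	Main struct {
		Temp float64 `json:"temp"`
	} `json:"main"`
	Weather []struct {
		Description string `json:"description"`
	} `json:"weather"`
}

// decodeWeather parses the API's JSON response body.
func decodeWeather(raw []byte) (Weather, error) {
	var w Weather
	err := json.Unmarshal(raw, &w)
	return w, err
}

// currentWeather queries the OpenWeather current-weather endpoint.
func currentWeather(city, apiKey string) (Weather, error) {
	url := fmt.Sprintf(
		"https://api.openweathermap.org/data/2.5/weather?q=%s&units=metric&appid=%s",
		city, apiKey)
	resp, err := http.Get(url)
	if err != nil {
		return Weather{}, err
	}
	defer resp.Body.Close()
	var w Weather
	err = json.NewDecoder(resp.Body).Decode(&w)
	return w, err
}
```

Each of the APIs listed above follows the same shape on the server: one small client function plus a struct mirroring just the fields the dashboard displays.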
The most complex data to retrieve were those related to the vacuum robot (Roborock S7), especially the map, and the public transportation data. For the latter, I tried to reverse-engineer the API used by the mobile app, but when I saw how complex it was just to retrieve the next tramway departures, I preferred to fall back on good old web scraping of their website.
What’s next?
Since the foundations are in place, it is now relatively easy to add new components to the dashboard, provided the data behind them are easily accessible, of course.
And I already have new components! But the main issue is that they rely on very personal data (bank accounts, investments, health data…). They can’t be displayed at the entrance of our apartment like that; we can’t show them to our friends and family when they come over.
This is why I want to implement a mechanism to switch to another panel. Ideally, I want Audrey or me to be able to swipe a hand in the air to do so; it implies plugging in a camera and detecting the gesture. I’ve already run some tests, and it looks promising!
More generally, I want to make this dashboard evolve into a home assistant, talking to us and answering our requests. I’ll dig into the existing open-source assistants for that; perhaps something cool is possible!
Last but not least, something even more futuristic-looking. I’m not a designer, so I try to get inspired here and here. This first version is still very web-based, with a lot of symmetry and boxes. I want something more alive, graphically speaking.
Aaand that’s all for today, folks! There is a lot more to say, but probably in a future article on the second version of the dashboard.