Björn Urban

  • About Me
  • Projects
  • Blog

Projects summarizing my professional experience with different frameworks, programming languages, and technologies.

Developing a Go Operator for Large-Scale DBaaS

The customer is a cloud provider that offers large-scale Database-as-a-Service for multiple database products such as PostgreSQL, MongoDB, and MSSQL. The customer databases run across a large number of different Kubernetes clusters. I was part of the engineering team that created the Kubernetes operators, written in Go, which manage every database product's infrastructure. We rebuilt the existing, unreliable operators from the ground up, defining a new, reliable, and extensible architecture and establishing thorough quality assurance with testing levels ranging from unit to end-to-end tests, thereby increasing maintainability and robustness.

Go
Helm
Kubernetes
SQL
MongoDB
PostgreSQL
ArgoCD
Azure DevOps
Ansible

Customer in financial sector/B2B Payment Services

In this project for a client in the financial sector, the primary focus was on replacing the existing core software that processed a high volume of transactions and documents. This software plays a key role in the client's day-to-day operations, in document regulation, and in due date determination – essential processes in the exchange between industry and trade. Additionally, payment transactions needed to be processed, necessitating strict adherence to regulatory compliance requirements.

The technology stack consisted of Kotlin/Java and Spring Boot, leveraging Kubernetes and the Azure Cloud as the platform. One aspect of the project, besides implementation, was the creation of architectural concepts that ensure scalability and robustness.

Furthermore, it was crucial to ensure compliance with complex tax and legal requirements, which demanded the development of efficient mechanisms for creating and managing report documents. In addition, the existing code base was comprehensively modernized and optimized.

Java
Kotlin
Helm
Kubernetes
Azure
React
Spring
SQL

Digital Building Documentation

The goal was to digitize the existing process of collecting building documentation data (metadata and documents). For this purpose, a React application was developed, which uses Microsoft SharePoint as the backend to simplify this process. Now, subcontractors have the ability to independently store data related to the trades they have installed. The project manager now only needs to validate the stored data after the documentation is completed. The filing structure and data schema were chosen to be standard-compliant, so that the data can also be used in other systems. The basic data was sourced directly from the CAD model.

React

Vehicle Data Collection and Analysis

This project involved the design of a Big Data pipeline and its prototypical implementation. After receiving the vehicle data (geo-coordinates), clustering of the coordinates was performed (implemented as a Spark batch job). Additionally, cluster assignment was carried out in a Spark streaming job. The pre-processed data were then analyzed using machine learning.
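
Conceptually, the batch and streaming steps can be sketched as follows. This is a minimal pure-Python stand-in, not the original Scala/Spark code; the grid-based clustering approach, the `cell_size` parameter, and all names are illustrative assumptions:

```python
from collections import defaultdict

def grid_cluster(coords, cell_size=0.01):
    """Batch step: assign each (lat, lon) pair to a grid cell.

    A simple stand-in for the clustering the Spark batch job performed;
    cell_size is in degrees, and each occupied cell acts as a cluster key.
    """
    clusters = defaultdict(list)
    for lat, lon in coords:
        key = (int(lat // cell_size), int(lon // cell_size))
        clusters[key].append((lat, lon))
    return dict(clusters)

def assign_to_cluster(point, clusters, cell_size=0.01):
    """Streaming step: map a newly arriving coordinate to a known cluster."""
    lat, lon = point
    key = (int(lat // cell_size), int(lon // cell_size))
    return key if key in clusters else None
```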

Spark
Scala

Real-Time Model-Free Object Tracking and Speed Estimation

In my thesis, I developed an approach for estimating velocity vectors of both static and moving objects in RGB-D images, which is critical for applications such as obstacle avoidance, where understanding the dynamics of a scene is essential. This was accomplished by first preprocessing the RGB-D data to isolate the objects, removing the floor from the scenes to simplify the tracking and categorization of objects in the image.

I employed a Voxel Region Growing algorithm for clustering scene elements based on their spatial locations, which aids in differentiating and tracking objects across consecutive frames. To associate objects across these frames, I adapted the Hungarian algorithm, enabling the precise computation of their velocity vectors. This method ensures that each object's trajectory is maintained, which is essential for the subsequent velocity calculations.

The velocity vectors are computed and validated against a ground truth to verify their accuracy. This validation is essential for ensuring that the vectors can reliably be used in practical scenarios, such as navigating through dynamic environments where accurate real-time responses are necessary.
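
The association and velocity steps can be sketched as follows. This is a simplified Python illustration, not the thesis's C++ implementation: it brute-forces the minimum-cost matching instead of using the Hungarian algorithm, assumes equal object counts in both frames, and all names are assumptions:

```python
from itertools import permutations
import math

def associate(prev_centroids, curr_centroids):
    """Match objects between frames by minimizing total centroid distance.

    Brute force over permutations for clarity; the Hungarian algorithm
    solves the same assignment problem in polynomial time. Assumes both
    frames contain the same number of objects.
    """
    best, best_cost = None, math.inf
    for perm in permutations(range(len(curr_centroids))):
        cost = sum(
            math.dist(prev_centroids[i], curr_centroids[j])
            for i, j in enumerate(perm)
        )
        if cost < best_cost:
            best, best_cost = perm, cost
    return list(enumerate(best))  # (prev_index, curr_index) pairs

def velocity(prev_c, curr_c, dt):
    """Velocity vector from centroid displacement between two frames."""
    return tuple((c - p) / dt for p, c in zip(prev_c, curr_c))
```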

I implemented the entire algorithm in C++ to leverage its performance capabilities, achieving cycle times of approximately 8 ms at a resolution of 848x480 pixels. This efficiency allows for real-time processing, which is crucial for the deployment in robotic systems and other applications requiring immediate reaction to changing conditions.

My thesis presents a comprehensive method for real-time velocity estimation using RGB-D images. The approach focuses on efficient data processing techniques, accurate object tracking across frames, and robust velocity computation, all of which contribute to the reliability and practicality of obstacle avoidance systems.

CPP
Python
OpenCV

KubeVoyage - A Kubernetes Authentication Proxy

KubeVoyage is a program that I set up to learn Go. I also wanted to test how well ChatGPT's coding capabilities are suited to writing a completely new application from scratch. My motivation was to replace Basic Auth in Kubernetes clusters with a solution that is less complicated than full-fledged auth proxies like Keycloak, yet still provides a simple form of authentication, including a graphical user interface. One use case is to facilitate the deployment of non-public websites to third parties, for example during the development phase. For this purpose, there is the option to generate access codes for external persons or to sign in via Single Sign-On through various providers, as well as by email and password, and then request access to a site. The tool does not have rights management; it is solely about providing non-public access to various sites as simply and flexibly as possible. Unlike Basic Auth, it is much easier to grant different users access to different ingresses.
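
The core per-request decision such a proxy makes can be sketched roughly like this. This is a hypothetical pure-Python illustration of forward-auth logic, not KubeVoyage's actual Go code; the data structures and function names are assumptions:

```python
def authorize(request_host, session, access_codes, grants):
    """Decide whether a request to request_host may pass through the proxy.

    session: dict with an optional 'user' (set after SSO / email login)
             or an optional 'code' (a generated access code)
    access_codes: mapping of access code -> host it unlocks
    grants: mapping of user -> set of hosts that user was granted
    Returns True to forward the request, False to redirect to login.
    """
    code = session.get("code")
    if code is not None and access_codes.get(code) == request_host:
        return True
    user = session.get("user")
    if user is not None and request_host in grants.get(user, set()):
        return True
    return False
```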

Go
Helm
Kubernetes

Cardmaster - A Doppelkopf Point Tracker

I really enjoy playing Doppelkopf, so I decided to build a browser-based score tracker to more easily track the various games in our card-playing group. There are several apps in the App Store that let you play Doppelkopf, but we wanted a simple way to play together in person and track scores. I wanted to test the Ktor and Koin libraries for Kotlin, as well as the SurrealDB database, and took this program as an opportunity to apply these technologies in a real use case. I implemented my point tracker with this tech stack and use it actively. In the future, additional features are planned, such as special rule modifiers and customizable point weighting, as well as a dedicated SurrealDB client in Kotlin.

Kotlin
React

Tool for Generating Test Data for Vehicles

The goal of the project was to rebuild a test data tool used for generating artificial vehicle data, and to specify and document its requirements. The tool, which was already in extensive use, was to become more robust and easier to use.

The test data were necessary to test online services independently of the vehicle. In the process, various corporate systems and their associated datasets had to be connected and synchronized. Users from different departments and from various service providers were able to independently create synthetic test vehicles via REST or through a web interface.

Java
React
Spring

Doclytics

To reduce paperwork at home, I have set up Paperless-ngx to manage documents digitally. Although the documents are searchable, they are not particularly easy to categorize, which led me to the idea of using an LLM for this purpose. However, I did not want to send personal documents to OpenAI. I therefore devised a solution that creates a classification for the documents locally using an LLM and then automatically stores it in Paperless. The goal of the tool is to extract metadata such as recipient, priority, etc. as automatically as possible. For this purpose, I built a Rust program that uses the Ollama API and packaged it as a container to provide a fast and secure way of classifying documents.
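
The classification step can be sketched as follows. This is an illustrative Python sketch of the idea, not the actual Rust code; the metadata fields, the prompt wording, and the assumption that the local model answers with strict JSON are all assumptions:

```python
import json

# Hypothetical set of metadata fields to extract from each document.
FIELDS = ["title", "correspondent", "document_type", "tags"]

def build_prompt(ocr_text):
    """Ask the local model to answer with strict JSON for the given fields."""
    return (
        "Extract the following metadata from the document below and answer "
        f"with a single JSON object with keys {FIELDS} and nothing else.\n\n"
        + ocr_text
    )

def parse_metadata(llm_reply):
    """Parse the model reply, keeping only the expected fields."""
    data = json.loads(llm_reply)
    return {k: data[k] for k in FIELDS if k in data}
```

In the real tool, the prompt would be sent to a locally running model via the Ollama API and the parsed fields written back to Paperless through its REST API.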

Rust
LLM

Portfolio Website

My personal portfolio website. I used Next.js to leverage server-side rendering. To easily maintain the content, I use Directus as a CMS. Both the site and the CMS run on a self-hosted, self-managed cluster. The website showcases my personal resume, project experience, and skill set, and also serves as an experimental project for frontend development and design. It is a constant work in progress, regularly updated with design and functionality improvements.

Kubernetes
React
Typescript
Pulumi

Cluster-Based Computation of Prefix Hijacking Events and Their Consequences

The objective of the project work was to estimate the effects of BGP prefix hijacking events using a heuristic algorithm we developed. For this purpose, a routing graph was created on a high-performance cluster based on BGP archives, and costs were subsequently assigned to the edges to approximate a routing partitioning.

CPP
Docker

Setup of a Big Data Development Toolchain

The toolchain helps reduce the effort involved in testing Big Data applications. It simulates all components of the production environment using Docker containers. This allows for local testing of, for example, Spark jobs. To start the environment, only a Bash script needs to be executed.

Kafka
Spark
Hadoop
Docker

TCP Interface for Closed-Loop Vehicle Simulation

The goal was to connect the CarMaker simulator with an internal simulator to test closed-loop vehicle software. The simulation data were serialized using Protobuf and then streamed to the target simulator via TCP.

CPP

Python Tool for Code Generation from Templates

The Python tool was tasked with generating C++ files from templates. The tool also automated the setup and build process of the resulting library.
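
Template-based generation of C++ sources can be sketched with Python's standard `string.Template`. This is an illustrative example, not the project's actual templates; the struct template and all names are assumptions:

```python
from string import Template

# Hypothetical template for a generated C++ header.
HEADER_TMPL = Template(
    "#pragma once\n"
    "\n"
    "struct ${name} {\n"
    "${fields}\n"
    "};\n"
)

def render_struct(name, fields):
    """Render a C++ struct from a list of (type, member) pairs."""
    body = "\n".join(f"    {t} {m};" for t, m in fields)
    return HEADER_TMPL.substitute(name=name, fields=body)
```

A build step could then write the rendered text to disk and invoke the compiler on the generated files.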

CPP
Python

IoT System with Azure and Embedded Devices

The goal of the internal project was to build an IoT architecture based on the Azure IoT Hub. For this, I analyzed various message formats and set up test devices that were provisioned with their own Linux image, built with Yocto and orchestrated with Mender. Among other tasks, I wrote the firmware for the ESP32 devices and set up the certificate-based communication between the devices and the IoT Hub. Additionally, I set up HashiCorp Vault and my own CA, and developed a concept for provisioning devices with certificates as well as for over-the-air updates.

CPP
Azure