Thomas
von der Ohe

Co-Founder, CEO

WHAT DRIVES ME

You come across an opportunity to positively impact societies around the globe maybe once in your lifetime. An opportunity to truly make a difference. Our unique approach to autonomous driving gives us this chance: redefining how people move in a better way. Building and shipping a product with this great team is what drives me day and night.

BACKGROUND

Launched Zoox’s first self-driving vehicle on public streets as lead Technical Program Manager in Silicon Valley, launched Amazon’s first Echo as lead Technical Program Manager for Device Software, and founded two funded mobility companies. M.Sc. in Management Science and Engineering, Stanford University.

Fabrizio
Scelsi

Co-Founder, CTO

WHAT DRIVES ME

Building products together with an amazing team based on cutting-edge technology to serve a greater purpose and solve problems – for the people, for our planet.

BACKGROUND

Managed an engineering team in Silicon Valley building autonomous shuttles; built teams and a range of mobility products: electric race cars, e-motorcycles, light electric vehicles, and electric passenger vehicles, including one of the most successful electric delivery vehicles in Germany. RWTH Aachen, Imperial College London.

Bogdan
Djukic

Co-founder, VP Engineering & Teledrive Experience

WHAT DRIVES ME

Vay is aiming to launch the first vehicle without a safety driver on public roads in Europe. This involves exciting engineering challenges, many of which have never been worked on before, ranging from autonomous vehicle technology and cybersecurity to backend systems, machine learning, and safety-critical software. Coming up with engineering solutions to these topics is something that I’m super excited to work on at Vay.

BACKGROUND

Team Lead at Microsoft, Senior Software Engineer at Skype. M.Sc. in Computer Science from the University of Belgrade.

Mariona
Bosch

VP Programs and Engineering Operations

WHAT DRIVES ME

Working closely with people from different cultures and professional backgrounds (hardware, software, operations, etc.) gives me the chance to learn new ways of approaching projects, structuring teams, and setting up processes every day. The results of this incredible teamwork are hugely rewarding, and visible in each step of our product.

BACKGROUND

Part of the management circle at AUDI AG, responsible for implementing prototypes at early development stages of new products (innovation vehicles, concept and pre-series vehicles, show cars, design models, single-part testing, PoCs, 3D printing).

Justin
Spratt

CBO - Business & Corporate Development

WHAT DRIVES ME

What drives me is working on goals that have a big impact on society. I also wanted to work with the smartest and most innovative people in the tech world, and Vay offers that more than any other company I’ve spoken to recently.

BACKGROUND

Responsible for global strategic partnerships at Uber, where he focused on deals with automotive OEMs, vehicle fleet operators, vehicle battery technology companies, and electrification infrastructure providers. Previously CEO and chief growth officer at Quirk. Began his professional life at Morgan Stanley as a fixed-income trader after studying economics and finance. Built the first startup incubator in Africa in 2002 and has been mentoring founders of technology startups for over ten years. He is an angel investor in software technology and holds board positions at some of these companies.

Irene
Molins

Director of People

WHAT DRIVES ME

Driven by a passion for fostering vibrant organizational cultures, I am a seasoned leader specializing in People and Culture. Throughout my career journey, I’ve championed initiatives to empower teams, from crafting recruitment strategies aligned with core values to implementing data-driven processes for optimal People Operations.

BACKGROUND

My background spans roles where I’ve spearheaded the establishment of People departments from scratch and led teams through significant growth. I thrive on continuous learning, embracing remote/hybrid cultures and cutting-edge HR technologies. I led innovative solutions to optimise processes and initiatives that earned InfoJobs a “Best Place to Work” award.
Outside of work, my passions for the mountains and my dog reflect my eagerness to explore the wild and my love for four-legged companions.

David
Gossow

Senior Principal Software Engineer

WHAT DRIVES ME

After working in autonomous robotics research in Silicon Valley for a long time, I am thrilled to be at a company that is finally bringing this technology into people’s everyday lives.

BACKGROUND

Tech Lead at Google Tango in Mountain View, Research Engineer at Willow Garage, yoga instructor since 2018.

Johanna
Loomis

Lead Industrial Designer

WHAT DRIVES ME

What drives me at Vay is the enthusiasm and passion for a vision that is embedded in the company culture, and at the same time the strict discipline in its implementation.

BACKGROUND

Industrial Product Design Lead at TEAMS DESIGN GmbH for over six years, having started as a Trainee in Shanghai. Worked as an Industrial and Product Designer at LOTHAR BOHM ASSOCIATES LIMITED, Lutz Herrmann Design, Indeed Innovation and Werksdesign.

Vladimir
Bilonenko

Director of Software Engineering

WHAT DRIVES ME

Helping engineers to do their best and most important work. Elegance in software. Bringing ideas from books to real life and from one domain to another. Going from A to B fast.

BACKGROUND

Software Generalist. Maps and Mobility Geek (Lon, Lat not Lat, Lon). High Load at Yandex, Geo Analytics and Last Mile at HERE Maps, Mobility Platform at Daimler. Conway’s Law Enthusiast.

Alina
Presti

Teledriver

WHAT DRIVES ME

A car enthusiast, driven by cars, driving and technology.

BACKGROUND

Nursery school teacher. Driver at Skoda’s start-up Caredriver.

Claire
Eagan

Director of Legal

WHAT DRIVES ME

I’m passionate about leveraging technology to democratize transportation and revolutionize the mobility landscape.

BACKGROUND

Senior Counsel Product & Strategy at Uber handling catastrophic loss claims and litigation, 13 years of experience in product & safety advising and risk mitigation. J.D., University of Illinois at Chicago.

How to measure glass-to-glass video latency?

by Bogdan Djukic (Vay Co-Founder)

Introduction

At Vay, we deeply believe that there is a better way to optimize the utilization rate of car sharing and ride hailing services. We are building a unique mobility service that will challenge private car ownership and introduce a new way of moving in big urban environments. The underlying technology empowering this new approach is teleoperation (remote vehicle control). Vay will be the first company to launch a mobility service capable of delivering empty vehicles to customers at their desired pick-up location and collecting them once the trip is over. Our vehicle fleet is remotely controlled by remote drivers (teledrivers) who can be in the same city, a different city, or even a different country from our customers.

As you might imagine, one of our core modalities is ultra-low-latency video streaming. In order to safely control what is essentially a 1.5-ton robot at higher speeds on public streets without a safety driver, we need to achieve a glass-to-glass video latency below 200 ms.

What is glass-to-glass video latency?

Glass-to-glass latency refers to the duration of the entire video pipeline: from the moment light hits the CMOS sensor of the camera mounted on the vehicle to the final rendering of the image on the teledriver’s screen. Every component in this chain (camera frame capture, ISP post-processing, encoding, network transmission, decoding, screen rendering) adds a certain amount of delay to the feedback the teledriver receives.
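To make the additive nature of the chain concrete, here is a small sketch of a latency budget. The stage names follow the list above, but the numbers are purely illustrative placeholders, not Vay's actual figures:

```python
# Hypothetical glass-to-glass latency budget (illustrative numbers only).
# Each pipeline stage contributes some delay; the sum is what the
# teledriver experiences end to end.

PIPELINE_STAGES_MS = {
    "camera frame capture": 33.3,   # e.g. one frame interval at 30 fps
    "ISP post-processing": 10.0,
    "encoding": 15.0,
    "network transmission": 40.0,
    "decoding": 10.0,
    "screen rendering": 16.7,       # e.g. one refresh interval at 60 Hz
}

def total_latency_ms(stages):
    """Sum the per-stage delays to get the glass-to-glass total."""
    return sum(stages.values())

if __name__ == "__main__":
    total = total_latency_ms(PIPELINE_STAGES_MS)
    print(f"Estimated glass-to-glass latency: {total:.1f} ms")
```

Even with optimistic per-stage numbers, the stages add up quickly, which is why every contributor has to be measured rather than guessed.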

Depending on how vertically integrated you are and how well your video pipeline is instrumented, you may have to treat your entire system as a black box in order to measure glass-to-glass latency with high accuracy and capture every contributor to the final latency number.

How not to measure glass-to-glass video latency

As with any measurement methodology, there are more and less precise ways to do this. The usual method is to put a high-precision digital stopwatch in front of the camera and photograph both the stopwatch and its rendering on the screen. This is a manual effort with low precision (limited by the refresh rate of the stopwatch display), and it gives you no picture of how the glass-to-glass latency is distributed over time. Other, less scientific methods involve snapping your fingers in front of the camera and watching the rendered image on the screen to get a subjective feel for the system’s latency.

How to do it properly?

At Vay, we recognized quite early in the project the importance of having a high-precision, reliable tool for measuring glass-to-glass latency. The glass-to-glass latency measuring tool (which is based on this paper) became important in evaluating and comparing different camera models and hardware compute platforms. Since we found it quite useful in our context, we decided to open source it. We hope you will find it useful as well for understanding your end-to-end system performance, and that you will consider it over more expensive alternatives.

The basic principle behind the tool is to handle both the emission of the light source (an LED placed in front of the camera lens) and its detection (a phototransistor placed on the computer screen) from a single device. Because one clock timestamps both events, no time synchronization is needed, which increases the precision of the method. With that in place, we can calculate the time delta between when the light source is activated (LED triggering typically takes only microseconds) and when it is detected on the screen. Another benefit of this approach is that you can run many measurements back to back and obtain the latency distribution over time.
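The single-clock measurement loop can be sketched as follows. The helpers `trigger_led` and `wait_for_phototransistor` are hypothetical stand-ins for whatever talks to the microcontroller; this is not the actual tool's API, just the shape of the idea:

```python
import time

def measure_once(trigger_led, wait_for_phototransistor):
    """Take one glass-to-glass latency sample.

    Both the LED trigger and the phototransistor detection are
    timestamped on the same clock, so no synchronization between
    camera side and screen side is needed.
    """
    t_emit = time.perf_counter()
    trigger_led()                 # light the LED in front of the camera
    wait_for_phototransistor()    # block until the screen lights up
    t_detect = time.perf_counter()
    return (t_detect - t_emit) * 1000.0  # delta in milliseconds

def measure_many(n, trigger_led, wait_for_phototransistor):
    """Collect n samples to estimate the latency distribution."""
    return [measure_once(trigger_led, wait_for_phototransistor)
            for _ in range(n)]
```

Using a monotonic clock such as `time.perf_counter` (rather than wall-clock time) matters here: only deltas on one clock are meaningful, which is exactly what the single-device design guarantees.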

In our repo, you will find an affordable setup based on an Arduino Uno and the related sensors (an LED and a phototransistor). Running a simple Python script kicks off a measurement test for a predefined number of cycles; the end result is a chart showing the glass-to-glass video latency distribution.
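Once you have a list of samples, summarizing the distribution is straightforward. As an illustration (independent of the tool's actual plotting code), a few robust summary statistics can be computed like this:

```python
import statistics

def summarize_latency(samples_ms):
    """Summarize a list of latency samples (in milliseconds)."""
    ordered = sorted(samples_ms)

    def percentile(p):
        # Nearest-rank percentile over the sorted samples.
        idx = min(len(ordered) - 1,
                  int(round(p / 100 * (len(ordered) - 1))))
        return ordered[idx]

    return {
        "mean": statistics.mean(ordered),
        "p50": percentile(50),
        "p95": percentile(95),
        "max": ordered[-1],
    }

if __name__ == "__main__":
    # Hypothetical samples from a measurement run.
    samples = [180, 175, 190, 210, 185, 178, 195, 182]
    print(summarize_latency(samples))
```

Looking at tail percentiles and the maximum, not just the mean, is what makes repeated measurements valuable: a pipeline that averages 150 ms but occasionally spikes past 300 ms is not safe for teledriving.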

Btw, we are hiring!

If you are excited about working on a deep-tech project that spans several engineering disciplines (robotics, perception, video, control, data science, embedded, cloud, mobile app development, etc.) and would like to contribute to Vay’s new mobility service concept, do check out Vay’s Careers page. We are Berlin-based, but we offer remote work options across different roles.

Related Stories

Vay selected to join the EIC Scaling Club network

Exciting News: Vay is Now Live in Las Vegas!

Our teledriving-first approach: How we build teledrive technology around safety and the human driver