Hi I'm

Denys Golotiuk

Software Engineering Expert

About Me

Welcome 🇺🇦 I’m Denys, a software engineering expert based in Kyiv, Ukraine. Master’s in engineering and an MBA. I have more than 18 years of experience in building efficient technical teams and delivering software products. Focused on data-intensive infrastructures, heavy loads, and analytical environments. Open-source stack. Technology agnostic.

  • Software Engineering
  • Data Processing
  • Analytical Systems
  • Machine Learning
  • Technical Leadership
  • Project Delivery
Open CV

What I do

Software Engineering

I create and deliver web (browser-based) software. Typically I deal with control systems (so-called admin panels), integrations, and automations of all kinds.

Analytical Infrastructures

I create systems that transform lots of randomly structured or unstructured data into accessible analytical products, consumed by analysts, data engineers, managers, and APIs.

Tech Management

I can lead technical teams and make sure results are delivered. My approach is to leverage people's strengths, keep the focus on targets, and adapt the process to keep up with the feedback loop.

Value from ML

While ML is overhyped nowadays, it can still serve as an efficient tool for automation and data processing far beyond human capabilities. I can make LLMs do the right job.

Performance Improvement

Given the loads of data we have to deal with, system performance is either overengineered or sacrificed. Still, performance engineering is something to address when the time comes.

Security & Availability

In the complex flow of data between storage and humans, security and availability have to be designed and implemented. Regulations like GDPR are meant to improve our systems.

Cost Reduction

While infrastructure cost is not a big concern at the startup stage, it can drain funds really fast during the growth stage. The right balance between cloud solutions and bare metal can help.

Data Pipelining

Before delivering any value, data has to be collected, cleansed, and processed. Building scalable and flexible data feeds and pipelines is critical for further valuable insights.

Team Training

Though training comes as a part of my management approach, I'm also an occasional blogger and like to share my knowledge. I run training sessions tailored to specific cases.

Technologies

  • OLTP
  • Postgres
  • MySQL
  • Redis
  • Mongo
  • OLAP
  • Redshift
  • ClickHouse
  • Kafka
  • Annoy
  • Text
  • Elastic
  • Sphinx
  • Manticore
  • Data & ML
  • Python
  • Ollama
  • Tesseract
  • Automation
  • NodeJS
  • Chrome RD
  • Android DB
  • Web
  • Nginx
  • PHP
  • JavaScript
  • Ubuntu
  • CI & CD
  • Git
  • GitHub
  • Docker
  • Ansible
  • AWS

Areas of expertise

  • Creating and implementing efficient architectures for data-intensive applications,
  • Developing and delivering secure and scalable data processing and analytics solutions,
  • Building efficient software development processes that generate fast results,
  • Assembling and scaling technical teams, including assessment and growth management,
  • Leading and training engineers to arm them with the right skills and tools,
  • Helping navigate crisis situations with both immediate solutions and strategic changes.

Things I've delivered

Full-time CTO Gen.tech

2007-2012

I was in charge of the technical side of online mass-market products. It was an early go-to-market stage, so we were actively researching markets to find the best strategy. The biggest challenge was to build a process that allowed us to quickly adapt our products to market needs while maintaining high availability and performance at scale.

  • Designed scalable and stable system architecture that handled tens of millions of requests per day,
  • Assembled and scaled technical team from 1 to 25 people,
  • Set up dynamic process based on short iterations and quick delivery,
  • Successfully launched 8 products in 6 European and Asian markets.

Partner, Analytical Infrastructure Gen.tech

2012-2014

During our growth stage, we needed a set of solutions to collect and process analytical data, so we could better adapt our products to user needs. The two challenges were to collect hundreds of millions of data points daily and to make this data available in real time to both automated and human consumers. I was also in charge of building a team of data engineers and analysts to crunch the numbers and drive insights.

  • Designed and implemented data collection and pipelining components,
  • Built scalable data warehouse for real-time analytics,
  • Assembled a team of data analysts and engineers, set up the process of continuous data quality improvement and insights delivery,
  • At the end of the day, this analytical infrastructure yielded a 10-fold increase in customer acquisition and retention.

VP of engineering .IO

2014-2022

I took on the role of VP of Engineering at an analytical product for online content platforms in the US and EU markets. The challenge was to design a system that could handle billions of data points daily, since our customers typically had 10m+ readers a day. Furthermore, we provided near-zero-latency data delivery, which allowed our customers to make decisions in a matter of seconds based on actual figures of user engagement and interests.

  • Designed and implemented data collection, processing, and presentation architecture,
  • Assembled and trained a strong team of technical experts that scaled the system to process billions of data points per day,
  • Led the team through security assessments to comply with GDPR,
  • Delivered multiple customized analytical subproducts for enterprise customers, including the largest Eastern European marketplaces.

Some of My Articles

Building Efficient Software Engineering Teams

Sharing my experience and findings on building efficient software engineering teams that actually deliver. How to get real and keep the focus on shipping results instead of excuses.

Read More

What is Actually a Neural Network?

The simple idea behind machine learning is to describe given data with a math function. Why? Because math functions allow us to get outputs for unknown inputs.

Read More

Working with Time Series Data in ClickHouse

Many datasets are collected over time to analyze and discover meaningful trends. Each data point usually has a timestamp assigned, as when we collect logs or business events.

Read More