Hi I'm

Denys Golotiuk

Software Engineering Expert

About Me

Welcome 🇺🇦 I’m Denys, a software engineering expert based in Kyiv, Ukraine. Master’s in engineering and an MBA. I have more than 18 years of experience building efficient technical teams and delivering software products. Focused on data-intensive infrastructures, heavy loads, and analytical environments. Open-source stack. Technology agnostic.

  • Software Engineering
  • Data Processing
  • Analytical Systems
  • Machine Learning
  • Technical Leadership
  • Project Delivery
Download CV

What I do

Software Engineering

I create and deliver web (browser-based) software. Typically I deal with control systems (so-called admin panels) and integrations and automations of all kinds.

Analytical Infrastructures

I create systems that transform lots of loosely structured or unstructured data into accessible analytical products consumed by analysts, data engineers, managers, and APIs.

Tech Management

I can lead technical teams and make sure results are delivered. My approach is to play to people’s strengths, keep focus on targets, and adapt the process to keep up with the feedback loop.

Value from ML

While ML is overhyped nowadays, it can still serve as an efficient tool for automation and data processing well beyond human capabilities. I can make LLMs do the right job.

Performance Improvement

Given the loads of data we have to deal with, system performance is either over-engineered or sacrificed. Still, performance engineering is something to address when the time comes.

Security & Availability

In the complex flow of data between storage and humans, security and availability have to be designed and implemented. Regulations like GDPR are meant to improve our systems.

Cost Reduction

While infrastructure cost is not a big concern at the startup stage, it can drain funds really fast during the growth stage. The right balance between cloud and bare-metal solutions can help.

Data Pipelining

Before it brings any value, data has to be collected, cleansed, and processed. Building scalable and flexible data feeds and pipelines is critical for valuable insights later on.

Team Training

Though training comes as part of my management approach, I’m also an occasional blogger and like to share my knowledge. I run training sessions tailored to specific cases.

Technologies

  • OLTP
  • Postgres
  • MySQL
  • Redis
  • Mongo
  • OLAP
  • Redshift
  • ClickHouse
  • Kafka
  • Annoy
  • Text
  • Elastic
  • Sphinx
  • Manticore
  • Data & ML
  • Python
  • Ollama
  • Tesseract
  • Automation
  • NodeJS
  • Chrome RD
  • Android DB
  • Web
  • Nginx
  • PHP
  • JavaScript
  • Ubuntu
  • CI & CD
  • Git
  • Github
  • Docker
  • Ansible
  • AWS

Areas of expertise

  • Creating and implementing efficient architectures for data-intensive applications,
  • Developing and delivering secure and scalable data processing and analytics solutions,
  • Building efficient software development process that generates fast results,
  • Assembling and scaling technical teams, including assessment and growth management,
  • Leading and training engineers to arm them with the right skills and tools,
  • Helping teams through crisis situations using both immediate solutions and strategic changes.

Things I've delivered

Full-time CTO Gen.tech

2007-2012

I've been in charge of the technical side of online mass-market products. It was an early stage of going to market, so we were actively researching markets to find the best strategy. The biggest challenge was to build a process that allowed us to quickly adapt our products to market needs while maintaining high availability and performance at scale.

  • Designed scalable and stable system architecture that handled tens of millions of requests per day,
  • Assembled and scaled technical team from 1 to 25 people,
  • Set up dynamic process based on short iterations and quick delivery,
  • Successfully launched 8 products in 6 European and Asian markets.

Partner, Analytical Infrastructure Gen.tech

2012-2014

During our growth stage, we required a set of solutions to collect and process analytical data so we could better adapt our products to user needs. The two challenges here were to collect hundreds of millions of data points daily and to make this data available in real time to both automated and human consumers. I was also in charge of founding a team of data engineers and analysts to crunch the numbers and drive insights.

  • Designed and implemented data collection and pipelining components,
  • Built scalable data warehouse for real-time analytics,
  • Assembled a team of data analysts and engineers, set up the process of continuous data quality improvement and insights delivery,
  • In the end, this analytical infrastructure yielded a 10-fold increase in customer acquisition and retention.

VP of engineering .IO

2014-2022

I took on the role of VP of engineering at an analytical product for online content platforms in the US and EU markets. The challenge was to design a system that could handle billions of data points daily, since our customers typically had 10M+ readers a day. Furthermore, we provided near-zero-latency data delivery, which allowed our customers to make decisions in a matter of seconds based on actual figures of user engagement and interests.

  • Designed and implemented data collection, processing, and presentation architecture,
  • Assembled and trained a strong team of technical experts that scaled the system to process billions of data points per day,
  • Led the team through security assessments to comply with GDPR,
  • Delivered multiple customized analytical subproducts for enterprise customers, including Eastern Europe's largest marketplaces.

Engineering Expert, Internal Data Fusion Platform Military

2022-2023

Since the beginning of the full-scale invasion, I've focused on driving technical innovations in the military field. The main challenge was to bring modern technologies, including ML, into military data collection and processing, so that huge amounts of incoming data could drive insights within a short time window. Another challenge was to set up local ML infrastructure: deploying and fine-tuning local models to deliver fast but accurate results at inference time.

  • Designed, implemented, and integrated a data processing solution that handles over 150 billion records,
  • Scaled the system to autonomously process terabytes of daily streams based on local ML and linear algorithms,
  • Provided secure multi-platform access for end-users, including researchers and analysts,
  • Integrated tens of data feeds, mostly proprietary, for automatic data ingestion.

Head of Engineering, Information Defense Platform Military

2023-2025

In 2023, I started working on a system aimed at securing the information field at the level of entire countries. I've assembled a team of technical and data experts, set up the processes, and delivered the first results within half a year. The next challenge was to scale the system further, which required not only data collection and processing but also hardware automation and content analysis on the fly. The technology is heavily based on generative models, including language and multi-modal ones.

  • Created a content feed collection and analysis system that processes 1M+ content items daily,
  • Implemented an automated hardware-based human behaviour emulation platform,
  • Designed and developed a semantic-based analysis system for detecting information distortion,
  • The system is already in service, scaling beyond hundreds of millions of processed content items.

Pricing Table

Full-time contracts

I am available for full-time contracts

$10k monthly
  • 1 year contract minimum
  • Strategic KPIs
  • Continuous reporting
  • Relocation is possible
Contact

Fixed Price Projects

Project-based contracts for shorter periods

$30k+ / project
  • 3 month contract minimum
  • Project goals
  • Iterative reporting
  • Remote is possible
Contact

Hourly Based Contracts

Urgent and periodic matters

$75 / hour
  • Consulting
  • Urgent help
  • Support & Audit
  • Assessment & Planning
Contact

Some of My Pubs

What is actually a neural network?

The simple idea behind machine learning is to describe given data with a math function. Why? Because math functions allow us to get outputs for unknown inputs.

Read More
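To illustrate the idea from that post: given data points, we can search for a function whose outputs match them, then use it for inputs we never observed. A minimal toy sketch of my own (not code from the post), fitting a line with gradient descent:

```python
# Fit y = w*x + b to sample data with plain gradient descent.
data = [(x, 2 * x + 1) for x in range(10)]  # points generated by the "unknown" function y = 2x + 1

w, b = 0.0, 0.0
lr = 0.01
for _ in range(5000):
    # Gradients of mean squared error with respect to w and b.
    grad_w = sum(2 * (w * x + b - y) * x for x, y in data) / len(data)
    grad_b = sum(2 * (w * x + b - y) for x, y in data) / len(data)
    w -= lr * grad_w
    b -= lr * grad_b

print(round(w, 2), round(b, 2))  # close to 2 and 1
prediction = w * 25 + b  # an output for an input that was never in the data
```

Once the function's parameters are learned, any input can be plugged in; that is the whole point of describing data with a function.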

Working with Time Series Data in ClickHouse

Many datasets are collected over time to analyze and discover meaningful trends. Each data point usually has a time assigned when we collect logs or business events.

Read More
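A common first step with time series is grouping points into fixed intervals. As a rough plain-Python illustration of the kind of hourly bucketing ClickHouse's toStartOfHour() performs (a toy sketch with made-up events, not code from the post):

```python
from collections import Counter
from datetime import datetime

# Hypothetical event log: (timestamp, event) pairs.
events = [
    (datetime(2024, 5, 1, 10, 15), "view"),
    (datetime(2024, 5, 1, 10, 42), "view"),
    (datetime(2024, 5, 1, 11, 3), "click"),
]

# Truncate each timestamp to the start of its hour and count events per bucket.
buckets = Counter(ts.replace(minute=0, second=0, microsecond=0) for ts, _ in events)
print(buckets)  # two events in the 10:00 bucket, one in the 11:00 bucket
```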

Data cleansing and preparation for analysis with Python and Pandas

Any data usually (always?) contain errors. In order to do accurate analysis and build efficient ML models, data needs to be cleansed.

Read More
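As a taste of what such cleansing looks like, here is a minimal pandas sketch of my own with made-up data (not code from the post): coerce malformed numbers, drop duplicates, drop rows with missing fields.

```python
import pandas as pd

# Toy dataset with typical issues: a missing value, a malformed number, a duplicate row.
df = pd.DataFrame({
    "user": ["a", "b", "b", None],
    "amount": ["10", "x", "x", "5"],
})

df["amount"] = pd.to_numeric(df["amount"], errors="coerce")  # "x" becomes NaN
df = df.drop_duplicates().dropna()  # remove the repeated row and rows with missing fields
print(df)  # only the fully valid row survives
```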