How we wrote a chicken egg counter on a Raspberry Pi

How it started Besides my main work on Upwork, I quite often pick up side projects. This time I found a project where I had to write a program for recognizing chicken eggs on a factory conveyor line. The customer wanted to install the application on a computer with a web camera mounted above the conveyor line; the application had to count the eggs and send the counts to a database. He also wanted to run this program on a cheap computer. The quality of the network in the factory isn’t stable, so the program had to be resilient enough to withstand network issues. There were enough challenges for me, so I decided to take on this project. The biggest one was that I had no serious experience with OpenCV and image recognition, so I wanted to test whether I could dive deep into an unknown field and come back with a successful result. The customer wanted 99% recognition accuracy. This post is the story of how the application was designed, how it was written and what problems I faced during development. I will try to explain each architecture decision, from the beginning to the end of the...
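The excerpt above doesn't show the actual detection pipeline, but to make the idea concrete, here is a minimal sketch of how egg-like shapes could be counted in a single camera frame with OpenCV's HoughCircles. The camera index, blur size and all circle-detection parameters are illustrative assumptions, not values from the project.

# Minimal sketch: count roughly circular (egg-like) blobs in one camera frame.
# Camera index and all HoughCircles parameters are illustrative guesses,
# not the values used in the actual project.
import cv2

cap = cv2.VideoCapture(0)           # assumed: first attached web camera
ok, frame = cap.read()
cap.release()

if ok:
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    gray = cv2.medianBlur(gray, 5)  # smooth sensor noise before circle detection
    circles = cv2.HoughCircles(
        gray,
        cv2.HOUGH_GRADIENT,
        dp=1.2,        # inverse accumulator resolution
        minDist=30,    # minimum distance between detected egg centres
        param1=100,    # Canny edge threshold
        param2=40,     # accumulator threshold: lower -> more (noisier) detections
        minRadius=15,
        maxRadius=60,
    )
    count = 0 if circles is None else circles.shape[1]
    print(f"eggs detected in this frame: {count}")

In a real pipeline the loop would run per frame and the same egg would have to be tracked across frames so it isn't counted twice, which is where most of the complexity lives.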

What I learned from AWS Lambda

For the past month, I have had a chance to work with AWS Lambda. While working with Lambda, I collected a lot of thoughts about this technology and would like to share them with you. Getting started So if you don’t know anything about AWS, I recommend starting with the official docs: Amazon has very rich documentation which explains all the details about Lambda. If you don’t want to read the whole doc: Lambda is a technology which allows you to deploy your code as so-called Lambda functions - containers somewhere inside the AWS infrastructure. This gives a lot of benefits: you pay only when you invoke your Lambda. The pricing is relatively low and, as usual, AWS has a free tier, which includes 1M free requests per month and 400,000 GB-seconds of compute time per month. The free tier description is a bit confusing, so I recommend using this table:

Memory (MB)    Free tier seconds per month    Price per 100ms ($)
128            3,200,000                      0.000000208
…              …                              …
512            800,000                        0.000000834
…              …                              …
1024           400,000                        0.000001667
…              …                              …
2048           200,000                        0.000003334
…              …                              …
3008           136,170                        0.000004897

Basically, for each particular...
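To show where the numbers in the table come from, here is a small sketch that derives the "free tier seconds" and "price per 100ms" columns from the 400,000 GB-seconds free allowance and a per-GB-second duration rate. The $0.0000166667 per GB-second figure is my assumption based on the standard Lambda pricing at the time; the last digit may differ slightly from the table because of rounding.

# Sketch: reproduce the free-tier / pricing table above from the per-GB-second rate.
# PRICE_PER_GB_SECOND is an assumed value from the standard AWS Lambda pricing page;
# verify it for your region before relying on it.
FREE_GB_SECONDS = 400_000           # free tier compute per month
PRICE_PER_GB_SECOND = 0.0000166667  # USD, duration charge after the free tier

for memory_mb in (128, 512, 1024, 2048, 3008):
    gb = memory_mb / 1024
    free_seconds = FREE_GB_SECONDS / gb           # how long a function of this size runs for free
    price_per_100ms = PRICE_PER_GB_SECOND * gb * 0.1
    print(f"{memory_mb:>5} MB  {free_seconds:>12,.0f} s  ${price_per_100ms:.9f} per 100ms")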

Are you sure microservices architecture is for you?

Today marks the start of the fourth year since I began my journey with microservices. I started with theoretical knowledge about this architecture and have ended up with much deeper, practical experience. While I still believe I will keep finding new problems in microservices, I have prepared an article with a list of problems I had a chance to face in my work. These are short stories from my day-to-day work. If you don’t agree with them and think they could have been identified and fixed earlier - that’s okay. I believe you can’t find two microservices setups with identical problems - every organization has its own path and its own problems, thus things that failed in one microservices architecture could be avoided in another. Isolated messaging layer This story is about the messaging layer. You know the story: every cool microservices architecture has to have its own messaging layer - the idea is that your services communicate with each other asynchronously. I spent some time explaining this in the Microservices interaction at scale using Apache Kafka article. So, the perfect scenario assumes that you have a bunch of microservices and some...
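For readers who haven't met the pattern, here is a minimal sketch of what asynchronous communication between two services over Kafka can look like, using the kafka-python client. The topic name, broker address and payload are made up for illustration and are not from the article.

# Minimal sketch of async service-to-service messaging over Kafka (kafka-python client).
# Topic name, broker address and payload are illustrative, not from the article.
import json
from kafka import KafkaProducer, KafkaConsumer

# Service A publishes an event and moves on; it does not wait for service B.
producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)
producer.send("order-events", {"order_id": 42, "status": "created"})
producer.flush()

# Service B (normally a separate process) consumes the event whenever it is ready.
consumer = KafkaConsumer(
    "order-events",
    bootstrap_servers="localhost:9092",
    auto_offset_reset="earliest",
    consumer_timeout_ms=5000,  # stop iterating if no new messages arrive
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
)
for message in consumer:
    print("service B received:", message.value)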

Kafka Consumer memory usage

I have been working with Kafka for more than 2 years, and I wasn’t sure whether a Kafka consumer eats more RAM when it reads from more partitions. I couldn’t find any useful information on the internet, so I decided to measure everything myself. Inputs I started with 1 broker, since I am interested in the actual memory consumption for topics with 1 and 1000 partitions. I know launching Kafka as a cluster can differ, because we have replication, acknowledgments, and other cluster things, but let’s skip that for now. Two basic commands for launching a single-node Kafka cluster:

bin/zookeeper-server-start.sh config/zookeeper.properties
bin/kafka-server-start.sh config/server.properties

I created two topics, topic1 with 1 partition and topic2 with 1000 partitions. I believe the difference in partition counts is enough for understanding memory consumption.

bin/kafka-topics.sh --create --zookeeper localhost:2181 --replication-factor 1 --partitions 1 --topic topic1
bin/kafka-topics.sh --create --zookeeper localhost:2181 --replication-factor 1 --partitions 1000 --topic topic2

It’s good that Kafka provides us with kafka-producer-perf-test.sh, a performance script which lets us load-test Kafka.

bin/kafka-producer-perf-test.sh --topic topic1 --num-records 99999999999999 --throughput 1 --producer-props bootstrap.servers=localhost:9092 key.serializer=org.apache.kafka.common.serialization.StringSerializer value.serializer=org.apache.kafka.common.serialization.StringSerializer --record-size 100

So, I launched load tests one after another to insert data into the two topics with a throughput of 1, 200, 500 and 1000 messages/second. I collected all...
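The excerpt stops before the measurements, but to illustrate the kind of check the article describes, here is a small sketch that subscribes a consumer to one of the topics and periodically prints the process's peak resident memory. It uses the kafka-python client and the standard-library resource module purely as my choice for illustration; the consumers and measurement tooling actually used in the article may be different.

# Sketch: subscribe a consumer to one topic and report the process's peak resident memory.
# kafka-python and the stdlib resource module are used here only for illustration; the
# article's actual consumer (and how it measured memory) may differ.
import resource
from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "topic2",                          # the 1000-partition topic created above
    bootstrap_servers="localhost:9092",
    group_id="memory-test",
    auto_offset_reset="earliest",
    consumer_timeout_ms=10_000,        # stop polling after 10 s of silence
)

consumed = 0
for _ in consumer:
    consumed += 1
    if consumed % 10_000 == 0:
        peak_kb = resource.getrusage(resource.RUSAGE_SELF).ru_maxrss  # kilobytes on Linux
        print(f"messages: {consumed:>8}  peak RSS: {peak_kb / 1024:.1f} MB")

Running the same script against topic1 and topic2 and comparing the printed figures gives a first, rough impression of how the partition count affects consumer memory.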

Key things you should know about being a freelancer

Hi, my name is Ivan Ursul and I have been a freelance engineer since 2015. It’s been a while since I started my career as an independent freelancer. I started as an engineer at Upwork, in one of their teams, where I was involved first in a reporting backend service, then in the time-tracker pipeline, which serves as the backend for the Upwork Tracker Application (UTA) client. Today I continue my work with Upwork, but I am also actively working with other customers, who are very different. I’ve successfully completed 22 projects since the very beginning. That’s why I decided to write an article about different aspects of the everyday life of a freelance engineer. You may agree or disagree; either way, I encourage you to leave your thoughts under this article. This article is grounded in the Upwork platform - I haven’t used other platforms, but I am quite sure the approach is the same. Learn your customer You will have to find out what your clients have in common. Are all of them technical? Do you prefer to work with non-technical people? What is your industry domain? These are the questions you should have answers to. After you realize what your customers have in common...