luminousmen

CI/CD

Why do we need it? During development, teams frequently need to update their services and deploy them to servers. When the number of projects is small, this is not an issue: releases and deployments are rare, and tests are run manually. But as the number of projects and tasks grows, executing the same tasks takes more and more time. Let's look at the typical process of implementing a feature on most projects:

1. Get a task from the backlog / from the team lead
2. Create a new branch in git
3. Implement the feature
4. Run the tests
5. Create a merge request and wait for code review
6. Merge the branch to master
7. Build the application
8. Deploy the new build

This process repeats for EVERY task. If it takes 10 days for a feature to be implemented, debugged and tested and 1...

Python Static Analysis Tools

Development teams are under pressure. Releases need to be delivered on time. Coding and quality standards need to be met. And mistakes are not an option. That's why development teams use static analysis. Static code analysis tools analyze an application's source code (or compiled code) so that vulnerabilities can be identified without executing the program. Why use static analysis?

- It provides insight into code without executing it
- It executes quickly relative to dynamic analysis
- It can automate maintaining code quality
- It can automate finding bugs early (not all of them, though)
- It can automate finding security issues early
- You're already using it (any IDE has static analysers built in; PyCharm uses pep8, for example)

What types...
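As a minimal illustration of the idea (this sketch is mine, not taken from any particular tool), the snippet below uses Python's standard ast module to flag bare `except:` clauses in a module. The source is parsed but never executed, which is the essence of static analysis:

```python
import ast


def find_bare_excepts(source: str) -> list:
    """Return line numbers of bare `except:` clauses found in `source`.

    The code is only parsed into an AST and inspected -- it is never run.
    """
    tree = ast.parse(source)
    issues = []
    for node in ast.walk(tree):
        # An ExceptHandler whose `type` is None is a bare `except:`.
        if isinstance(node, ast.ExceptHandler) and node.type is None:
            issues.append(node.lineno)
    return issues


code = """
try:
    risky()
except:
    pass
"""
print(find_bare_excepts(code))  # [4]
```

Real linters such as pylint or flake8 work on the same principle, just with far more rules layered on top of the tree walk.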

Azure Blob Storage with Pyspark

Azure Blob Storage is a service for storing large amounts of unstructured object data, such as text or binary data. You can use Blob Storage to expose data publicly to the world or to store application data privately. In this post, I'll explain how to access Azure Blob Storage using the Spark framework in Python. So, imagine that you already have an Azure storage account with some data in it. Accessing data in Azure Blob requires installing additional libraries, because it uses the wasb/wasbs protocol rather than standard tcp/http/etc. The jar files hadoop-azure.jar and azure-storage.jar need to be added to spark-submit when you submit a job. At the application level, first of all, as always in Spark applications, you need to grab a Spark session: session =...
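To make the moving parts concrete, here is a small sketch that builds the wasbs:// URL Spark uses to address a blob path and the Hadoop configuration key under which the storage account key is supplied. The account and container names are placeholders of my own, not values from the post:

```python
def wasbs_url(account: str, container: str, path: str) -> str:
    """Build the wasbs:// URL Spark uses to address a path in Blob Storage."""
    return f"wasbs://{container}@{account}.blob.core.windows.net/{path.lstrip('/')}"


def account_key_conf(account: str) -> str:
    """Hadoop configuration key under which the storage account key is set."""
    return f"fs.azure.account.key.{account}.blob.core.windows.net"


# Placeholder names -- substitute your own storage account and container.
url = wasbs_url("my_account", "my_container", "data/events.parquet")
conf_key = account_key_conf("my_account")
print(url)       # wasbs://my_container@my_account.blob.core.windows.net/data/events.parquet
print(conf_key)  # fs.azure.account.key.my_account.blob.core.windows.net
```

With a real Spark session you would then call something like `session.conf.set(conf_key, "<storage-account-key>")` and read with `session.read.parquet(url)`, assuming hadoop-azure.jar and azure-storage.jar are on the classpath as described above.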

Who is a team lead?

On my current project, I'm playing the role of a Team Lead. It is one of those roles whose responsibilities many people understand differently and often confuse with the Senior role. In this post, I want to clarify this question and describe how I see the responsibilities of a Team Lead. Let's start with the Senior role. A Senior in a team is closer to the technical side of the project: a kind of guru who deeply understands the project (its development stack, its architecture), helps to solve arising technical issues, teaches newcomers, shares their experience and builds high-level solutions (sometimes trying on the role of a Software Architect). Team Lead is a role which implies not only a deep understanding of the technical side of the project but also team...

I started (again) my blog

I started my blog again. I'd highlight a few reasons why I need it: Learn to write well. Back in my childhood, I loved to put my thoughts on paper. As a child, I wrote stories, even poems, but over time I lost the ability to express my thoughts well. I hope this blog will help me put words and phrases in the right order so that my thoughts can easily reach other people. Improve my English. I'm not a native speaker and my English is far from ideal. Writing articles about what I come across and the things I care about seems like a good way to strengthen my writing skills in English. Cheatsheet. I'll be honest: what I write here will be primarily for me. I will share with my future self the recipes and thoughts I had before. This will at least help me...