Pyspark setup for IntelliJ IDEA
on 2021-01-24
Simple configuration of a new Python IntelliJ IDEA project with a working PySpark. I was inspired by the "Pyspark on IntelliJ" blog post by Gaurav M Shah; I just removed all the parts about deep learning libraries. I assume that you have a working IntelliJ IDEA IDE with the Python plugin installed, and Python 3 installed on your machine. We will create a Python project in IntelliJ IDEA, change its Python SDK to a virtualenv-based one, add pyspark as a dependency of this virtualenv, install it, and finally test everything with a small PySpark hello world.
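As a preview of that last step, here is a minimal hello-world sketch of the kind you could run to check the setup (the application name and sample data are mine, not necessarily the post's):

```python
from pyspark.sql import SparkSession

# A local Spark session using all available cores.
spark = (SparkSession.builder
         .master("local[*]")
         .appName("hello-pyspark")
         .getOrCreate())

# If this tiny dataframe prints, pyspark is correctly installed in the virtualenv.
df = spark.createDataFrame([("hello", 1), ("world", 2)], ["word", "count"])
df.show()

spark.stop()
```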
Pyspark gotchas for Scala Spark developers
on 2021-01-22
Apache Spark is developed in Scala, but the Python API is more and more popular as Python becomes the main language of Data Science. Although the Python and Scala APIs are very close, there are some differences that can prevent a developer used to one API from smoothly using the other. This article lists those small differences, from the point of view of a Scala Spark developer wanting to use PySpark.
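A classic example of such a difference (my illustration, not necessarily one from the article): Scala Spark needs `===` for column equality because `==` is already Scala's own equality, whereas PySpark simply overloads Python's comparison operators.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.master("local[*]").getOrCreate()
df = spark.createDataFrame([(1, "a"), (2, "b")], ["id", "label"])

# Scala:  df.filter($"id" === 1)   // triple '=', since '==' is Scala's own equality
# Python: the ordinary '==' works, because PySpark overloads Python's operators.
df.filter(F.col("id") == 1).show()
```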
Spark custom aggregator behavior on ordered window with duplicates
on 2020-12-06
User-defined aggregate functions are a powerful tool in Spark: you can avoid a lot of useless computation by crafting aggregate functions that do exactly what you want. However, their behavior can sometimes be surprising. For instance, be careful when using a custom aggregator over a window ordered by a column that contains duplicate values: the buffer is not flushed at each row, but only when the value in the ordering column changes.
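The post is about a custom Scala Aggregator, but the underlying window semantics can be sketched with a built-in aggregate in PySpark: an ordered window defaults to a RANGE frame, so rows tied on the ordering column are peers and all receive the same aggregated value (an illustration of the behavior, not the post's actual code):

```python
from pyspark.sql import SparkSession, Window
from pyspark.sql import functions as F

spark = SparkSession.builder.master("local[*]").getOrCreate()

# Two rows share the same ordering value (ts = 2).
df = spark.createDataFrame(
    [("a", 1, 10), ("a", 2, 20), ("a", 2, 30), ("a", 3, 40)],
    ["grp", "ts", "value"],
)

# An ordered window without an explicit frame defaults to a RANGE frame,
# so the two rows with ts = 2 are peers and both get the running sum 60.
w = Window.partitionBy("grp").orderBy("ts")
df.withColumn("running_sum", F.sum("value").over(w)).show()
```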
Option versus nullable: which type spark deserializes faster
on 2020-11-12
Recently, I was wondering about Spark’s deserialization performance, especially this question: when you have a nullable column in a dataframe, is it better to deserialize it to an Option or to a nullable type? Let’s answer this question in this blog post with the following benchmark: I create simple input data, then read it with three Spark applications that select a column, replace its null values with a default value, and write the result to Parquet.
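The Option-versus-nullable comparison itself relies on Scala's typed Dataset API, but the shape of each benchmark application can be sketched as follows (the paths and column name are hypothetical):

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()

# Hypothetical input path, for illustration only.
df = spark.read.parquet("/tmp/benchmark/input")

# Each variant of the benchmark does the same logical work:
# keep one nullable column, replace nulls with a default, write back.
result = df.select(F.coalesce(F.col("value"), F.lit(0)).alias("value"))
result.write.mode("overwrite").parquet("/tmp/benchmark/output")
```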
Reading parquets with different schemas in Spark
on 2020-10-25
Yesterday, I ran into a behavior of Spark’s DataFrameReader that can be misleading when reading Parquet data. If a data directory contains several Parquet files with different schemas, and we neither provide a schema nor use the mergeSchema option, the inferred schema depends on the order of the Parquet files in the directory. The setup: I am reading data stored in Parquet format.
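For reference, mergeSchema is passed to the reader like this (a minimal sketch; the path is hypothetical):

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Without mergeSchema, Spark infers the schema from a single file, so the
# result depends on file order; with it, Spark merges all the file schemas.
df = spark.read.option("mergeSchema", "true").parquet("/tmp/mixed-schemas")
df.printSchema()
```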
Install Hugo static website with nginx and let’s encrypt certificate using ansible
on 2020-07-19
In this article, I will show you how I configured the deployment of my blog, which uses the static site generator Hugo. To do so, I used the following tools: Ansible (version 2.9.10) as the configuration management system, Nginx as the web server, and Let’s Encrypt with certbot for the TLS/SSL certificates. This article will go through all the Ansible tasks I had to put in place to install nginx, configure my blog’s virtualhost, add TLS/SSL encryption to the blog, and build the blog with Hugo.
Install JDK 8 on MacOS Without Admin Rights
on 2020-07-04
Many corporations are still developing applications using version 8 of Java, and many corporations have strict security rules forbidding employees from having administrator rights on their machines. However, not so many corporations use MacBook Pros as employee machines. The advantage of a MacBook Pro for a developer is that it is rather easy to install software without administrator rights: you just install Homebrew in your home directory, and you can install lots of software without asking anyone for permission.