Latest from the Blog
Different ways of creating delta table in Databricks
Today we will learn different ways of creating a Delta table in Databricks. We will check how tables can be created using existing Apache Spark code and also Spark SQL. Create a managed Delta table using SQL: in a managed table, Databricks maintains both the data and the metadata of the table, which means if you…
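As a taste of what the post covers, a managed Delta table can be created with plain SQL. This is only a sketch assuming a Databricks SQL environment; the table name and columns are hypothetical. Omitting a LOCATION clause is what makes the table managed, so Databricks controls where the data files live.

```sql
-- Hypothetical example table; adjust names and types to your data.
-- No LOCATION clause: Databricks manages both the data and the metadata.
CREATE TABLE employees (
  id   INT,
  name STRING
) USING DELTA;
```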
Spark SQL Count Function
Spark SQL has a count function that is used to count the number of rows of a DataFrame or table. We can also count specific rows. People who have exposure to SQL should already be familiar with this, as the implementation is the same. Let’s see the syntax and an example. But before that, let’s create a…
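Since the post notes that Spark SQL's count behaves the same as in standard SQL, the idea can be illustrated without a Spark cluster. The sketch below uses Python's built-in sqlite3 module as a stand-in; the table and data are made up for the example, but the COUNT(*) queries would read the same in Spark SQL.

```python
import sqlite3

# Illustrative stand-in for Spark SQL: COUNT semantics are standard SQL,
# so sqlite3 (Python stdlib) is used here with a hypothetical table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE employees (name TEXT, dept TEXT)")
conn.executemany(
    "INSERT INTO employees VALUES (?, ?)",
    [("Alice", "HR"), ("Bob", "IT"), ("Carol", "IT")],
)

# Count all rows in the table
total = conn.execute("SELECT COUNT(*) FROM employees").fetchone()[0]

# Count only specific rows by adding a WHERE clause
it_only = conn.execute(
    "SELECT COUNT(*) FROM employees WHERE dept = 'IT'"
).fetchone()[0]

print(total)    # 3
print(it_only)  # 2
```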
Spark Escape Double Quotes in Input File
Here we will see how Spark escapes double quotes in an input file. Ideally, having double quotes in a column in a file is not an issue. But we face an issue when the content inside the double quotes also has double quotes along with the file separator. Let’s see an example of this. Below is the data we…
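The underlying problem is standard CSV quoting: when a quoted field contains both the delimiter and embedded double quotes, the embedded quotes must be escaped (in RFC 4180 style, by doubling them). The sketch below illustrates this with Python's stdlib csv module on made-up data; in Spark itself, the CSV DataFrameReader exposes `quote` and `escape` options to handle the same situation.

```python
import csv
import io

# Hypothetical input: the second field contains both the comma separator
# and embedded double quotes, escaped RFC 4180 style by doubling them.
raw = 'id,comment\n1,"He said ""hello, world"" to me"\n'

# The csv reader unescapes the doubled quotes and keeps the comma
# inside the quoted field intact.
rows = list(csv.reader(io.StringIO(raw)))
print(rows[1])  # ['1', 'He said "hello, world" to me']
```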