Sat.May 25, 2024 - Fri.May 31, 2024


Building cost effective data pipelines with Python & DuckDB

Start Data Engineering

1. Introduction
2. Project demo
3. TL;DR
4. Building efficient data pipelines with DuckDB
4.1. Use DuckDB to process data, not for multiple users to access data
4.2. Cost calculation: DuckDB + Ephemeral VMs = dirt cheap data processing
4.3. Processing data less than 100GB? Use DuckDB
4.4. Distributed systems are scalable, resilient to failures, & designed for high availability
4.5.


Building Data Platforms (from scratch)

Confessions of a Data Guy

Of all the duties that Data Engineers take on during the regular humdrum of business, most are the same old, same old: build a new pipeline, update a pipeline, add a data model, fix a bug, and so on. It's never-ending. It's a constant stream of data, new and old, spilling into our Data Warehouses and […] The post Building Data Platforms (from scratch) appeared first on Confessions of a Data Guy.



5 Free MIT Courses to Learn Math for Data Science

KDnuggets

Learning math is super important for data science. Check out these free courses from MIT to learn linear algebra, statistics, and more.


Introducing the Robinhood Crypto Trading API

Robinhood

Robinhood Crypto customers in the United States can now use our API to view crypto market data, manage portfolios and account information, and place crypto orders programmatically. Today, we are excited to announce the Robinhood Crypto trading API, ushering in a new era of convenience, efficiency, and strategy for our most seasoned crypto traders. Robinhood Crypto customers in the United States can use our new trading API to set up advanced and automated trading strategies that allow them to st


Apache Airflow® 101 Essential Tips for Beginners

Apache Airflow® is the open-source standard for managing workflows as code. It is a versatile tool used in companies around the world, from agile startups to tech giants to flagship enterprises, across all industries. Due to its widespread adoption, Airflow knowledge is paramount to success in the field of data engineering.


Introducing Salesforce BYOM for Databricks

databricks

Salesforce and Databricks are excited to announce an expanded strategic partnership that delivers a powerful new integration - Salesforce Bring Your Own Model.


5 Free Python Courses for Data Science Beginners

KDnuggets

Are you a data science beginner looking to learn Python? Start learning today with these 5 free courses.


How To Data Model – Real Life Examples Of How Companies Model Their Data

Seattle Data Guy

How companies model their data varies widely. They might say they use Kimball dimensional modeling. However, when you look in their data warehouse, the only parts you recognize are the words fact and dim. Over the past near-decade, I have worked for and with different companies that have used various methods to capture this data.… Read more The post How To Data Model – Real Life Examples Of How Companies Model Their Data appeared first on Seattle Data Guy.


Infoshare 2024: Stream processing fallacies, part 1

Waitingforcode

Last week I was speaking in Gdansk on the DataMass track at Infoshare. As often happens, the time slot constrained what I wanted to share, but maybe that's for the best. Otherwise, you wouldn't be reading about stream processing fallacies!


What’s New from the Geodatabase Team in ArcGIS Pro 3.3

ArcGIS

Here's everything new in ArcGIS Pro 3.3 from the Geodatabase Team.


Apache Airflow® Best Practices: DAG Writing

Speaker: Tamara Fingerlin, Developer Advocate

In this new webinar, Tamara Fingerlin, Developer Advocate, will walk you through many Airflow best practices and advanced features that can help you make your pipelines more manageable, adaptive, and robust. She'll focus on how to write best-in-class Airflow DAGs using the latest Airflow features like dynamic task mapping and data-driven scheduling!


Top SQL Queries for Data Scientists

KDnuggets

SQL seems like a data science underdog compared to Python and R. However, it's far from one. I'll show you here how you can use it as a data scientist.
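As a taste of the kind of query the article covers, here is one staple data-scientist pattern, a group-by aggregate, run against an in-memory SQLite database using only the Python standard library; the table and column names are illustrative:

```python
import sqlite3

# In-memory database with a small illustrative table.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (customer TEXT, amount REAL);
    INSERT INTO orders VALUES ('alice', 30), ('alice', 20), ('bob', 15);
""")

# Average order value and order count per customer.
query = """
    SELECT customer, AVG(amount) AS avg_amount, COUNT(*) AS n_orders
    FROM orders
    GROUP BY customer
    ORDER BY avg_amount DESC
"""
for customer, avg_amount, n_orders in conn.execute(query):
    print(customer, avg_amount, n_orders)
```

The same query runs unchanged on most warehouses, which is exactly why SQL pays off for analysts: the pattern transfers across engines.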


Why Data Analysts And Engineers Make Great Consultants

Seattle Data Guy

Many data engineers and analysts don’t realize how valuable the knowledge they have is. They’ve spent hours upon hours learning SQL, Python, how to properly analyze data, build data warehouses, and understand the differences between eight different ETL solutions. Even what they might think is basic knowledge could be worth $10,000 to $100,000+ for a… Read more The post Why Data Analysts And Engineers Make Great Consultants appeared first on Seattle Data Guy.


Data Migration Strategies For Large Scale Systems

Data Engineering Podcast

Summary Any software system that survives long enough will require some form of migration or evolution. When that system is responsible for the data layer the process becomes more challenging. Sriram Panyam has been involved in several projects that required migration of large volumes of data in high traffic environments. In this episode he shares some of the valuable lessons that he learned about how to make those projects successful.


Introduction to the Export Attachments geoprocessing tool

ArcGIS

Learn about the new Export Attachments geoprocessing tool in ArcGIS Pro 3.3 and how it simplifies the process of exporting attachments.


Optimizing The Modern Developer Experience with Coder

Many software teams have migrated their testing and production workloads to the cloud, yet development environments often remain tied to outdated local setups, limiting efficiency and growth. This is where Coder comes in. In our 101 Coder webinar, you’ll explore how cloud-based development environments can unlock new levels of productivity. Discover how to transition from local setups to a secure, cloud-powered ecosystem with ease.


Exploring Google’s Latest AI Tools: A Beginner’s Guide

KDnuggets

Check out this beginner's guide to take advantage of Google’s AI tools.


Snowflake Ventures Expands Investment in Sigma, Deepening Commitment to Bringing World-Class BI Directly into the AI Data Cloud

Snowflake

We’re excited to announce today that we’re reinforcing our commitment and deepening our partnership with Sigma with an expanded investment from Snowflake Ventures. Sigma is a leading business intelligence and analytics solution that makes it easy for employees to explore live data, create compelling visualizations and collaborate with colleagues. Sigma allows employees to break free of dashboards and build workflows, powered by write-back to Snowflake through their unique Input Tables capability


Social Impact Using Data and AI: Revealing the 2024 Finalists for the Data For Good Award

databricks

The annual Data Team Awards celebrate the critical contributions of data teams to various sectors, spotlighting their role in driving progress and positive.


Choose similar colors to map similar things

ArcGIS

Three videos about choosing colors in cartography.


15 Modern Use Cases for Enterprise Business Intelligence

Large enterprises face unique challenges in optimizing their Business Intelligence (BI) output due to the sheer scale and complexity of their operations. Unlike smaller organizations, where basic BI features and simple dashboards might suffice, enterprises must manage vast amounts of data from diverse sources. What are the top modern BI use cases for enterprise businesses to help you get a leg up on the competition?


Google Has Just Dropped a New Course: AI Essentials

KDnuggets

A course that helps career switchers and advancers harness the power of AI to transform the way they work.


Snowflake Ventures Increases Investment in Hex, Deepening the Partnership for Collaborative Workspace Capabilities in the Data Cloud  

Snowflake

The AI Data Cloud unlocks the power of data for technical and non-technical users alike, including data analysts, data scientists, data engineers and business users. When employees can collaborate seamlessly to generate new insights, share findings and create efficient workflows, organizations can drive even more efficiency, unlocking value from their data, faster.


From Data to Destinations: How Skyscanner Optimizes Traveler Experiences with Databricks Unity Catalog

databricks

This blog is authored by Michael Ewins, Director of Engineering at Skyscanner. At Skyscanner, we're more than just a flight search engine.


ArcGIS Pro Virtualization Hardware and VM Profiles

ArcGIS

ArcGIS Pro virtualization server hardware and VM profiles for the best user experience.


Apache Airflow® Crash Course: From 0 to Running your Pipeline in the Cloud

With over 30 million monthly downloads, Apache Airflow is the tool of choice for programmatically authoring, scheduling, and monitoring data pipelines. Airflow enables you to define workflows as Python code, allowing for dynamic and scalable pipelines suitable to any use case from ETL/ELT to running ML/AI operations in production. This introductory tutorial provides a crash course for writing and deploying your first Airflow pipeline.


5 Python Best Practices for Data Science

KDnuggets

Level up your Python skills for data science by following these best practices.


Retail Media’s Business Case for Data Clean Rooms Part 2: Commercial Models

Snowflake

In Part 1 of “Retail Media’s Business Case for Data Clean Rooms,” we discussed how to (1) assess your data assets and (2) define your data structures and permissions. Once you have a plan on paper, you can begin sizing the data clean room opportunity for your business. Step 3: Commercial Models to Unlock Revenue at Scale Modeling the business value comes down to two things: (1) What data are you making accessible; and (2) How many partners are you willing (and able) to engage?


Delta Sharing and The Emergence of the Lakehouse Customer Data Platform (CDP)

databricks

Special thanks to Caleb Benningfield and Sam Malissa at Amperity for their valuable insights and contributions to this blog. Today, businesses face a.


Orchestrating a Dynamic Time-series Pipeline with Azure Data Factory and Databricks

Towards Data Science

Explore how to build, trigger and parameterize a time-series data pipeline in Azure, accompanied by a step-by-step tutorial.


Prepare Now: 2025's Must-Know Trends For Product And Data Leaders

Speaker: Jay Allardyce, Deepak Vittal, Terrence Sheflin, and Mahyar Ghasemali

As we look ahead to 2025, business intelligence and data analytics are set to play pivotal roles in shaping success. Organizations are already starting to face a host of transformative trends as the year comes to a close, including the integration of AI in data analytics, an increased emphasis on real-time data insights, and the growing importance of user experience in BI solutions.


5 Best End-to-End Open Source MLOps Tools

KDnuggets

Explore free and open-source MLOps tools for enhanced data privacy and control over your models and code.


Retail Media’s Business Case for Data Clean Rooms Part 1: Your Data Assets and Permissions

Snowflake

It’s hard to have a conversation in adtech today without hearing the words, “retail media.” The retail media wave is in full force, piquing the interest of any company with a strong, first-party relationship with consumers. Companies are now understanding the value of their data and how that data can power a new, high-margin media business. The two-sided network that exists between retailers and their brands turns into a flywheel for growth.


Latest Computer Science Research Topics for 2024

Knowledge Hut

Everybody has a dream, whether it's becoming a doctor, an astronaut, or anything else that fits your imagination. If you are someone with a keen interest in looking for answers and knowing the "why" behind things, you might be a good fit for research. Further, if that interest revolves around computers and tech, you could make an excellent computer science researcher!