Thu, May 16, 2024


Mind the map: a new design for the London Underground map

ArcGIS

A modern take on the London tube map with updated accessible colours, a re-classification of lines by type, and line symbols scaled by frequency.


Unlocking the Potential of Private Data Sharing using Databricks Private Exchanges

databricks

We are thrilled to announce a new feature on the Databricks Marketplace that simplifies the process of setting up private exchanges.




Exploring Natural Sorting in Python

KDnuggets

Learn how to perform natural sorting in Python using the natsort Python library.
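The library the article covers exposes `natsort.natsorted()`, which orders strings the way a human would (`item2` before `item10`). The same idea can be sketched with only the standard library, using a sort key that compares digit runs numerically (file names below are illustrative):

```python
import re

def natural_key(s):
    # Split into alternating non-digit/digit runs; digit runs compare as ints.
    return [int(part) if part.isdigit() else part.lower()
            for part in re.split(r"(\d+)", s)]

files = ["item10.txt", "item2.txt", "item1.txt"]
print(sorted(files, key=natural_key))
# ['item1.txt', 'item2.txt', 'item10.txt']
```

Plain `sorted(files)` would put `item10.txt` before `item2.txt` because it compares character by character; the key above fixes that, which is exactly what `natsorted` does out of the box.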


Multiresolution Object Detection with Text SAM

ArcGIS

This blog post walks you through the process of running multiresolution deep learning over a range of cell sizes.


Apache Airflow® 101: Essential Tips for Beginners

Apache Airflow® is the open-source standard to manage workflows as code. It is a versatile tool used in companies across the world from agile startups to tech giants to flagship enterprises across all industries. Due to its widespread adoption, Airflow knowledge is paramount to success in the field of data engineering.


Feature Engineering for Beginners

KDnuggets

This guide introduces some key techniques in the feature engineering process and provides practical examples in Python.
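Two of the staple techniques such a guide typically covers are numeric scaling and one-hot encoding of categoricals. A minimal pure-Python sketch (the guide itself likely uses pandas/scikit-learn; data values here are illustrative):

```python
def min_max_scale(values):
    """Rescale numeric values linearly onto the [0, 1] range."""
    lo, hi = min(values), max(values)
    return [(v - lo) / (hi - lo) for v in values]

def one_hot(categories):
    """Encode categorical values as one-hot vectors, one column per level."""
    levels = sorted(set(categories))
    return [[1 if c == level else 0 for level in levels] for c in categories]

ages = [20, 30, 40]
print(min_max_scale(ages))   # [0.0, 0.5, 1.0]
colors = ["red", "blue", "red"]
print(one_hot(colors))       # [[0, 1], [1, 0], [0, 1]]
```

Scaling keeps features with large ranges from dominating distance-based models, and one-hot encoding turns labels into numbers without imposing a false ordering.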



Thomas Miller, PhD, explores Northwestern University’s Online Graduate Programs in Data Science

KDnuggets

Join Thomas Miller for an online information session to learn more about Northwestern University's online graduate programs in Data Science.


Semiconductors on the Data Intelligence Platform

databricks

In the semiconductor industry, research and development tasks, manufacturing processes, and enterprise planning systems produce an array of data artifacts that can be fused to create an intelligent semiconductor enterprise. Through intelligent data use, an intelligent semiconductor enterprise accelerates time to market, increases manufacturing yield, and enhances product reliability.


How Snowflake and Merit Helped Provide Over 120,000 Students with Access to Education Funding 

Snowflake

Snowflake joined forces with Merit to provide an identity verification platform and a set of program delivery services that help run large-scale government programs in areas such as licensing regulations, workforce development, emergency management, and educational grants and scholarships. By augmenting the power of the Snowflake Data Cloud with Merit’s platform, Snowflake is enabling government and education entities to share data and empower secure, modern and robust collaboration.


Contributing to Apache Kafka®: How to Write a KIP

Confluent

Learn how to contribute to open source Apache Kafka by writing Kafka Improvement Proposals (KIPs) that solve problems and add features! Read on for real examples.


Apache Airflow® Best Practices: DAG Writing

Speaker: Tamara Fingerlin, Developer Advocate

In this new webinar, Tamara Fingerlin will walk you through many Airflow best practices and advanced features that can help you make your pipelines more manageable, adaptive, and robust. She'll focus on how to write best-in-class Airflow DAGs using the latest Airflow features like dynamic task mapping and data-driven scheduling!


Trustworthy AI, Powered by Trusted Data

Precisely

Key takeaways: The success of your AI initiatives hinges on the integrity of your data. Ensure your data is accurate, consistent, and contextualized to enable trustworthy AI systems that avoid biases, improve accuracy and reliability, and boost contextual relevance and nuance. Adopt strategic practices in data integration, quality management, governance, spatial analytics, and data enrichment.


Dynamic Table and Alerts for Customer Purchase Notifications

Cloudyard

Snowflake Dynamic Table and Alerts: this use case addresses automating customer accessory purchase monitoring in Snowflake to provide the marketing team with timely insights for personalized promotions. Imagine you’re a Data Engineer/Data Governance developer for an online retail store: you ensure the marketing team receives insights on customer accessory purchases so it can offer targeted promotions.


NPM Install Packages: Install Global and Local NPM Packages

Knowledge Hut

It doesn't matter whether you're working entirely in Node.js or using it as a front-end package manager or build tool; npm is an essential part of modern web development workflows on any platform. For a newbie, it may be tough to grasp the fundamental notions of npm as a tool. We spent a lot of time figuring out seemingly little nuances that others take for granted.
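The core local-vs-global distinction the article covers, as shell commands (package names are illustrative examples, not recommendations):

```shell
# Local install (the default): saved under ./node_modules and
# recorded in package.json's "dependencies".
npm install lodash

# Local install as a development-only dependency ("devDependencies").
npm install --save-dev eslint

# Global install: puts the package's CLI on your PATH,
# for tools you run from any directory (may need sudo on some systems).
npm install -g typescript

# List what is installed globally, top level only.
npm list -g --depth=0
```

Rule of thumb: libraries your project imports go local; command-line tools you invoke everywhere go global.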


How Generative AI is Transforming Customer Experiences in Real Time

Striim

The ability to quickly understand and respond to customer demands is critical for staying ahead of the competition. Generative AI (GenAI) is quickly reshaping customer experiences across various sectors. It enables businesses to engage with their clients in real time, providing an unprecedented level of personalization and responsiveness. This innovative approach not only boosts customer satisfaction but also cultivates loyalty and encourages sustained interaction.


Apache Airflow® Crash Course: From 0 to Running your Pipeline in the Cloud

With over 30 million monthly downloads, Apache Airflow is the tool of choice for programmatically authoring, scheduling, and monitoring data pipelines. Airflow enables you to define workflows as Python code, allowing for dynamic and scalable pipelines suitable to any use case from ETL/ELT to running ML/AI operations in production. This introductory tutorial provides a crash course for writing and deploying your first Airflow pipeline.
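To give a flavour of "workflows as Python code", here is a minimal DAG sketch using Airflow's TaskFlow API. It assumes `apache-airflow` 2.x is installed; the pipeline and task names are illustrative, not from the tutorial itself:

```python
from datetime import datetime

from airflow.decorators import dag, task

@dag(schedule=None, start_date=datetime(2024, 1, 1), catchup=False)
def my_first_pipeline():
    @task
    def extract():
        # Stand-in for pulling records from a source system.
        return [1, 2, 3]

    @task
    def load(records):
        print(f"loaded {len(records)} records")

    # Calling one task with another's output defines the dependency.
    load(extract())

my_first_pipeline()
```

Dropping a file like this into the `dags/` folder is enough for the scheduler to pick it up; the `load(extract())` call is what wires `extract >> load` in the task graph.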


Just Launched: Data Products

Monte Carlo

At what level does it make sense to deploy your data monitoring coverage? For most traditional data quality tools the answer has been by table. The typical workflow includes scanning or profiling a table, and then applying a number of suggested checks or rules. One by one by one by… That’s awfully tedious for any environment with thousands, hundreds, or even just dozens of tables.
