Usually you can identify the biggest Zalando customers among your colleagues by the number of Zalando boxes they collect under their desks :) Some creative colleagues were inspired by the sight of stockpiled parcels and saw more in them than just packages. They started the Hack Week project “cardboard furniture”, in which they are currently building stuff out of old cardboard boxes.
A good Hadoop big data resume might not be enough to get you selected, but a bad one is enough for rejection. Many big data professionals consider writing a big data Hadoop resume an exercise in psychological warfare. Are you one of them? Do you want to move your big data Hadoop resume from the slush pile to the "YES" pile? Then you must follow some important guidelines to ensure that your resume does not land in the "NO" pile of CVs. This article aims to provide those guidelines.
Five years of extremely fast growth and continuous success lie behind us. Always forward-focused, we hardly had time to look left or right. But for one week in late December, our Technology department came to a full stop and let the year wind down with playful innovation and experimentation. Say “Hi” to Hack Week – one week for over 400 people to brainstorm and execute their very own ideas without limits across teams, functions and several office locations.
Hadoop is present across all vertical industries today as organizations leverage big data analytics to gain a competitive advantage. With petabytes of transaction data amassed on a regular basis, several banking and financial institutions have already shifted to Hadoop. Those that have not are making enterprise Hadoop adoption a priority in 2015, as they do not want to risk a huge loss of market share.
Apache Airflow® 3.0, the most anticipated Airflow release yet, officially launched this April. As the de facto standard for data orchestration, Airflow is trusted by over 77,000 organizations to power everything from advanced analytics to production AI and MLOps. With the 3.0 release, the top-requested features from the community were delivered, including a revamped UI for easier navigation, stronger security, and greater flexibility to run tasks anywhere at any time.
Pig and Hive are two key components of the Hadoop ecosystem. What do Pig and Hive solve? They share a similar goal: both are tools that ease the complexity of writing Java MapReduce programs. However, when to use Pig Latin and when to use HiveQL is a question most developers face. This post briefly introduces the Apache Hive and Apache Pig components of the Hadoop ecosystem.
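To make the Pig-versus-Hive contrast concrete, here is a hedged sketch (not from the post itself) showing the same per-customer order count expressed in declarative HiveQL and in procedural Pig Latin, submitted from Python. The table name, file name, schema, and the availability of the hive and pig CLIs are all assumptions.

```python
# Hypothetical comparison: one aggregation, two Hadoop-ecosystem dialects.
# Assumes the hive and pig command line tools are installed and that an
# "orders" Hive table / orders.csv file exist; all names are illustrative.
import subprocess

HIVEQL = "SELECT customer_id, COUNT(*) AS n_orders FROM orders GROUP BY customer_id;"

PIG_LATIN = """
orders  = LOAD 'orders.csv' USING PigStorage(',') AS (customer_id:chararray, amount:double);
grouped = GROUP orders BY customer_id;
counts  = FOREACH grouped GENERATE group AS customer_id, COUNT(orders) AS n_orders;
DUMP counts;
"""

# HiveQL reads like SQL, which suits ad-hoc analyst queries ...
subprocess.run(["hive", "-e", HIVEQL], check=True)
# ... while Pig Latin is a step-by-step dataflow, which suits multi-stage pipelines.
subprocess.run(["pig", "-e", PIG_LATIN], check=True)
```

The rule of thumb this illustrates: reach for HiveQL when the job is a query, and for Pig Latin when it is a pipeline.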
According to the Industry Analytics Report, Hadoop professionals get a 250% salary hike. Java developers have a higher probability of getting a strong salary hike when they shift to big data job roles. If you are a Java developer, you might have already heard about the excitement revolving around big data and Hadoop. Many of your peers will have already made a career shift into big data Hadoop to secure a consistent career path by gaining expertise in big data job skills.
Hadoop’s significance in data warehousing is growing rapidly as a transitory platform for extract, transform, and load (ETL) processing. Mention ETL and attention turns to Hadoop as a logical platform for data preparation and transformation, since it lets organizations manage huge volume, variety, and velocity of data flawlessly. Hadoop is extensively talked about as the best platform for ETL because it serves as an all-purpose staging area and landing zone for enterprise big data.
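As a minimal illustration of that staging-area role (a sketch under assumed details, not from the article): a Hadoop Streaming mapper in Python that performs the "transform" step by dropping malformed click-log records before they reach the warehouse. The four-field CSV layout is a made-up example.

```python
#!/usr/bin/env python3
# Hypothetical Hadoop Streaming mapper: clean raw CSV click logs during ETL.
# Assumed record layout: timestamp,user_id,url,status
import sys

for line in sys.stdin:
    fields = line.rstrip("\n").split(",")
    if len(fields) != 4:
        continue                      # drop malformed records (the T in ETL)
    ts, user_id, url, status = fields
    if not status.isdigit():
        continue                      # drop rows with a non-numeric HTTP status
    # Hadoop Streaming expects tab-separated key/value pairs on stdout.
    print(f"{user_id}\t{ts},{url},{status}")
```

A job like this would typically be launched through the hadoop-streaming jar, with the reducer omitted or left as an identity step, so Hadoop handles the distribution while the mapper stays a plain script.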
One of the most visible and successful projects from last Hack Week, even outside Zalando, initially seemed like just a nice gadget. As a result of the “KINECT Virtual Dressing (for Xbox)” project, users could try on Zalando's products using the Kinect sensor bar, a widely known extension for Microsoft's Xbox game console.
Business as usual? Re-inventing the Zalando shopping experience? Optimizing our backend and logistics? Not this week! At least not exactly :-) Zalando’s second Hack Week has officially kicked off. If you are wondering what Hack Week is all about: it’s an event where Zalando Technology staff create, innovate and participate in various projects and events that are not necessarily connected to their daily work.
Python is great for writing command line scripts and we use it a lot for internal tools and scripts at Zalando. Before extending a three-line Bash script, I usually rethink it and implement it in Python. This post summarizes some conventions and best practices I recommend. Command Line Options: Do you know the command line options of GNU tar? Probably not all of them.
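A minimal sketch of the kind of convention such a post recommends (the flag names are illustrative, not taken from the original): using argparse instead of hand-rolled option parsing gives every internal tool consistent flags and a free --help.

```python
#!/usr/bin/env python3
"""A three-line Bash script regrown as a Python CLI (illustrative example)."""
import argparse
import sys


def main() -> int:
    parser = argparse.ArgumentParser(description="Sync files to a target directory.")
    parser.add_argument("source", help="directory to read from")
    parser.add_argument("-n", "--dry-run", action="store_true",
                        help="show what would be done without doing it")
    parser.add_argument("-v", "--verbose", action="store_true",
                        help="print progress to stderr")
    args = parser.parse_args()

    if args.verbose:
        print(f"processing {args.source} (dry run: {args.dry_run})", file=sys.stderr)
    # ... actual work would go here ...
    return 0


if __name__ == "__main__":
    sys.exit(main())
```

Unlike GNU tar's dozens of single-letter options, every flag here documents itself through the generated --help output.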
Big data and Hadoop are catch-phrases these days in the tech media for describing the storage and processing of huge amounts of data. However, while you might be familiar with big data and Hadoop, there is a high probability that other people around you are not really sure what big data is, what Hadoop is, what big data analytics is, or why it is important.
Speaker: Alex Salazar, CEO & Co-Founder @ Arcade | Nate Barbettini, Founding Engineer @ Arcade | Tony Karrer, Founder & CTO @ Aggregage
There’s a lot of noise surrounding the ability of AI agents to connect to your tools, systems and data. But building an AI application into a reliable, secure workflow agent isn’t as simple as plugging in an API. As an engineering leader, it can be challenging to make sense of this evolving landscape, but agent tooling provides such high value that it’s critical we figure out how to move forward.
Imagine a world where everyone could print their own shoes. These guys are working on it, exploring 3D printing as a new way to quickly prototype designs.
Sending a shoe into space is no easy task: in just a few days, a handful of developers, engineers, and product managers created this prototype, which they will launch tomorrow! Here we take a behind-the-scenes look at the team and the equipment the guys are using to send their shoe into space. Up, up and away!!
Customer satisfaction is key to Zalando. We're always looking at new ways to improve our service. This group uses Artificial Intelligence to create Zalanda, a friendly voice which you can take with you wherever you go.
Virtual reality is all the rage right now. And we have tons of spare Zalando cardboard boxes lying around. These creative folks put two and two together and are exploring new ways Zalando can use virtual reality.
Speaker: Andrew Skoog, Founder of MachinistX & President of Hexis Representatives
Manufacturing is evolving, and the right technology can empower—not replace—your workforce. Smart automation and AI-driven software are revolutionizing decision-making, optimizing processes, and improving efficiency. But how do you implement these tools with confidence and ensure they complement human expertise rather than override it? Join industry expert Andrew Skoog as he explores how manufacturers can leverage automation to enhance operations, streamline workflows, and make smarter, data-driven decisions.
Hack Week 3 starts today, so it is now my pleasure to give you a short introduction to what Hack Week is and what will happen at Zalando Technology in the next few days. The idea: if I have to summarize Hack Week in one sentence, I’d say it is similar to the Google Friday, but here we have it for an entire week. The topics range from new digital shopping experiences like virtual reality dressing rooms, trying on clothes digitally, robots, delivery by drones, and 3D printing, to experimenting with new technologies.
Fashion designers meet engineers! Our fashion designers from zLabels join hands with our engineers to take on smart wearables. If smart wearable electronics, circuit boards and soldering are your thing, be sure to take a look at how one team tackles this futuristic challenge.
How do you feel when you receive your favorite shoes from Zalando? I am sure you would be very excited to put them on as quickly as possible. But in order to get those shoes out of the box, you first have to unpack the Zalando box. What if Zalando could make you even happier and capture your emotions when you unpack that box? A team at Hack Week is working hard to find out what the best unpacking experience for Zalando customers would be.
With Big Data evolving at a rapid pace, its processing frameworks are evolving in full swing as well. Hadoop has progressed from the more restricted processing model of batch-oriented MapReduce jobs (Hadoop 1.0) to specialized and interactive processing models (Hadoop 2.0). With the advent of Hadoop 2.0, organizations can build data-crunching methodologies within Hadoop that were not possible under Hadoop 1.0's architectural limitations.
With Airflow being the open-source standard for workflow orchestration, knowing how to write Airflow DAGs has become an essential skill for every data engineer. This eBook provides a comprehensive overview of DAG writing features with plenty of example code. You’ll learn how to: understand the building blocks of DAGs, combine them in complex pipelines, and schedule your DAG to run exactly when you want it to; write DAGs that adapt to your data at runtime and set up alerts and notifications; and scale your DAGs.
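For a taste of those building blocks, here is a minimal DAG sketch assuming the Airflow 2.x TaskFlow API (Airflow 2.4+ for the schedule argument); the dag_id, schedule, and task bodies are placeholders, not examples taken from the eBook.

```python
# Minimal illustrative DAG: two tasks wired extract -> load, run daily.
from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def example_etl():
    @task
    def extract() -> list[int]:
        # placeholder for pulling rows from a source system
        return [1, 2, 3]

    @task
    def load(rows: list[int]) -> None:
        # placeholder for writing to a warehouse
        print(f"loaded {len(rows)} rows")

    load(extract())  # calling tasks wires the dependency graph


example_etl()
```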
Confused over which framework to choose for big data processing: Hadoop MapReduce or Apache Spark? This blog helps you understand the critical differences between these two popular big data frameworks. Hadoop and Spark are popular Apache projects in the big data ecosystem, and Apache Spark is an improvement on the original Hadoop MapReduce component of the Hadoop big data ecosystem.
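To make the difference tangible, here is a hedged PySpark word-count sketch (the input path is an assumption): the whole pipeline is a few chained in-memory transformations, where classic MapReduce would require separate Java mapper and reducer classes and a disk write between stages.

```python
# Illustrative PySpark word count; the input path is a placeholder.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("wordcount-sketch").getOrCreate()
sc = spark.sparkContext

counts = (
    sc.textFile("hdfs:///data/corpus.txt")   # the "map" input
      .flatMap(lambda line: line.split())
      .map(lambda word: (word, 1))
      .reduceByKey(lambda a, b: a + b)       # the "shuffle + reduce" step
)
print(counts.take(10))
spark.stop()
```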
Hadoop is the way to go for organizations that do not want to add load to their primary storage system and want to write distributed jobs that perform well. The MongoDB NoSQL database is used in the big data stack for storing and retrieving one item at a time from large datasets, whereas Hadoop is used for processing those large data sets. To keep the load off MongoDB in the production database, data processing is offloaded to Apache Hadoop.
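A hedged sketch of that offloading pattern (connection string, database, collection, and output path are all assumptions): export operational documents from MongoDB as newline-delimited JSON so the heavy processing runs in Hadoop rather than against the production database.

```python
# Hypothetical MongoDB -> Hadoop handoff: dump documents to NDJSON,
# a format Hadoop jobs can ingest line by line.
import json

from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")   # assumed connection string
orders = client["shop"]["orders"]                   # assumed db/collection names

with open("orders.ndjson", "w") as out:
    for doc in orders.find({}, {"_id": 0}).limit(100_000):
        out.write(json.dumps(doc, default=str) + "\n")
```

From here the file would be copied into HDFS (for example with hdfs dfs -put) and processed by the batch job, keeping analytical reads off the production MongoDB.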
Tips for Developing Effective Big Data Applications Using Hadoop: With so many use cases for big data and Hadoop in an enterprise, there are several challenges Hadoop developers need to overcome when identifying a use case and measuring its success rate. The largest barrier to enterprise adoption of Hadoop is not having a clearly defined big data use case.
Many software teams have migrated their testing and production workloads to the cloud, yet development environments often remain tied to outdated local setups, limiting efficiency and growth. This is where Coder comes in. In our 101 Coder webinar, you’ll explore how cloud-based development environments can unlock new levels of productivity. Discover how to transition from local setups to a secure, cloud-powered ecosystem with ease.
Hadoop can address the toughest challenges in business intelligence today through multi-structured data and advanced big data analytics. Big data technologies like Hadoop have become a complement to various conventional BI products and services. Thus, several organizations are preparing to use Hadoop, integrating it into their business intelligence products or services to get the most business value.
When I was walking around in the Zalando offices searching for a Hack Week project to write about, I passed by a meeting room that instantly caught my attention. The door to the room had been left open and the team seemed to have gone out for lunch. I had actually already decided what to write about and wanted to visit the team that I had looked up in our Hack Week Wiki and interview them.
With big data gaining traction in the IT industry, companies are looking to hire competent Hadoop-skilled talent more than ever before. The best way to understand the different technical professionals working with HDFS, MapReduce and the entire Hadoop ecosystem is to look at various Hadoop job descriptions, which are a mixed bag ranging from developers to data scientists.
Over the next decade, industries will be using Big Data to solve previously unsolved data problems in the physical world. Big Data analysis will be about building systems around the data that is generated. Every department of an organization, including marketing, finance and HR, is now getting direct access to its own data. This is creating a huge job opportunity, and there is an urgent requirement for professionals to master Big Data Hadoop skills.
Large enterprises face unique challenges in optimizing their Business Intelligence (BI) output due to the sheer scale and complexity of their operations. Unlike smaller organizations, where basic BI features and simple dashboards might suffice, enterprises must manage vast amounts of data from diverse sources. What are the top modern BI use cases for enterprise businesses to help you get a leg up on the competition?
As technology evolves, eLearning is making a significant impact, promising to deliver greater benefits than classroom training. Professionals are now taking big data and Hadoop training online from various eLearning websites to upgrade their IT skill sets. Several companies have realized the importance of online big data Hadoop training and are equipping their employees with Hadoop skills to enhance the quality of service they provide to their customers.