
Niklas Lang
Articles
-
2 months ago |
towardsdatascience.com | Niklas Lang | Roger Noble | Mikkel Dengsøe | Rohan Paithankar
As we have already seen with the basic components (Part 1, Part 2), the Hadoop ecosystem is constantly evolving and being optimized for new applications. Consequently, various tools and technologies have developed over time that make Hadoop more powerful and even more widely applicable. It therefore goes beyond the pure HDFS & MapReduce platform and offers, for example, SQL and NoSQL queries as well as real-time streaming.
-
2 months ago |
towardsdatascience.com | Niklas Lang | Roger Noble | Mikkel Dengsøe | Rohan Paithankar
Nowadays, a large amount of data is collected on the internet, which is why companies are faced with the challenge of being able to store, process, and analyze these volumes efficiently. Hadoop is an open-source framework from the Apache Software Foundation and has become one of the leading Big Data management technologies in recent years. The system enables the distributed storage and processing of data across multiple servers.
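The distributed processing the teaser describes follows the MapReduce model: a map step emits key-value pairs, a shuffle step groups them by key, and a reduce step aggregates each group. A minimal single-process sketch (function names are illustrative, not part of the Hadoop API) of a word count, the canonical MapReduce example:

```python
from collections import defaultdict

def map_phase(document: str):
    # Map: emit a (word, 1) pair for every word in the input split.
    for word in document.lower().split():
        yield word, 1

def shuffle_phase(pairs):
    # Shuffle: group all values by key, as Hadoop does between map and reduce.
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    return grouped

def reduce_phase(grouped):
    # Reduce: aggregate the values for each key (here: sum the counts).
    return {word: sum(counts) for word, counts in grouped.items()}

documents = ["big data needs big tools", "hadoop processes big data"]
pairs = [pair for doc in documents for pair in map_phase(doc)]
counts = reduce_phase(shuffle_phase(pairs))
```

In a real Hadoop cluster, the map and reduce phases run in parallel on many servers, with HDFS providing the distributed storage underneath.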
-
Jan 16, 2025 |
towardsdatascience.com | Niklas Lang
An in-depth article about dimensionality reduction and its most popular methods
Dimensionality reduction is a central method in the field of Data Analysis and Machine Learning that makes it possible to reduce the number of dimensions in a data set while retaining as much of the information it contains as possible. This step is necessary to reduce the dimensionality of the dataset before training to save computing power and avoid the problem of overfitting.
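One of the most popular methods the article covers is principal component analysis (PCA). A minimal NumPy sketch (variable names are illustrative): center the data, compute the covariance matrix, and project onto the eigenvectors with the largest eigenvalues, i.e. the directions of greatest variance.

```python
import numpy as np

def pca(X: np.ndarray, n_components: int) -> np.ndarray:
    X_centered = X - X.mean(axis=0)          # center each feature
    cov = np.cov(X_centered, rowvar=False)   # feature covariance matrix
    eigvals, eigvecs = np.linalg.eigh(cov)   # eigh: for symmetric matrices
    order = np.argsort(eigvals)[::-1]        # sort by variance, descending
    components = eigvecs[:, order[:n_components]]
    return X_centered @ components           # project onto top components

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))                # 100 samples, 5 features
X_reduced = pca(X, n_components=2)           # reduced to 2 dimensions
```

The reduced dataset keeps the two directions along which the data varies most, which is what "retaining as much information as possible" means for PCA.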
-
Dec 12, 2024 |
towardsdatascience.com | Niklas Lang
Introduction to activation functions and an overview of the most famous functions
Neural networks have become a powerful tool in machine learning in recent years. The activation function is a central component in every neural network, which significantly influences the model’s functionality. It determines how strongly a neuron in the network is activated and thus decides which structures are learned from the data.
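Three of the most widely used activation functions can be sketched in a few lines of NumPy; each maps a neuron's weighted input to its activation:

```python
import numpy as np

def sigmoid(x):
    # Squashes inputs into (0, 1); often used for binary outputs.
    return 1.0 / (1.0 + np.exp(-x))

def tanh(x):
    # Zero-centered relative of the sigmoid, output in (-1, 1).
    return np.tanh(x)

def relu(x):
    # Rectified Linear Unit: passes positive inputs, zeroes out negatives.
    return np.maximum(0.0, x)

x = np.array([-2.0, 0.0, 2.0])
activations = {"sigmoid": sigmoid(x), "tanh": tanh(x), "relu": relu(x)}
```

The nonlinearity is the point: without it, stacked layers would collapse into a single linear transformation and could not learn complex structures.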
-
Nov 25, 2024 |
towardsdatascience.com | Niklas Lang
Discover the role of batch normalization in streamlining neural network training and improving model performance
Batch normalization has become a very important technique for training neural networks in recent years. It makes training much more efficient and stable, which is a crucial factor, especially for large and deep networks. It was originally introduced to solve the problem of internal covariate shift.
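The forward pass of batch normalization is short enough to sketch directly: normalize each feature over the mini-batch to zero mean and unit variance, then rescale with the learnable parameters gamma and beta (a minimal sketch, omitting the running statistics used at inference time):

```python
import numpy as np

def batch_norm(x, gamma, beta, eps=1e-5):
    mean = x.mean(axis=0)                    # per-feature batch mean
    var = x.var(axis=0)                      # per-feature batch variance
    x_hat = (x - mean) / np.sqrt(var + eps)  # normalize each feature
    return gamma * x_hat + beta              # learnable scale and shift

# Mini-batch of 3 samples with 2 features on very different scales.
batch = np.array([[1.0, 50.0],
                  [2.0, 60.0],
                  [3.0, 70.0]])
out = batch_norm(batch, gamma=np.ones(2), beta=np.zeros(2))
```

After normalization, both features have roughly zero mean and unit variance regardless of their original scale, which is what stabilizes and speeds up training of deep networks.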