Set up, test and deploy a leading-edge big data environment for our clients… Big data architecture, Python, Spark, Hadoop.
- Work with us and not for us! We are in this together.
- Change the world of big data and analytics by following your passion for leading-edge technology, such as machine learning and Artificial Intelligence.
- Participate in a state-of-the-art development process based on CI/CD and Docker solutions.
- Have access to the latest knowledge and newest technology due to our partnership with EIT (European Institute of Innovation and Technology) that allows you to work together with other high-tech companies and (technical) universities around Europe.
- Work in an energetic & fun team that consists of commercial and technical people (including the brightest data/brain scientists).
Odyssey Prime is a tech start-up with a mission! We are building the leading analytics platform of the future that helps large companies (our clients) stay on top of their game in a world that is increasingly data-driven. We make this happen by providing our clients with valuable (business) insights via our very own analytics platform. Among other things, we generate nifty reports, perform advanced analysis and enable the discovery of innovative solutions. We strongly believe that true success lies in ‘empowering people’ and therefore put personalization, intuitive design and making difficult things look easy at the heart of everything we do. The European Institute of Innovation and Technology (EIT) has selected us as a high-potential start-up and we’re proud of that!
We have a professional, progressive culture, in which ambition, intrinsic motivation and cooperation are our key values. We are transparent, innovative and challenge each other to achieve maximum results. Last but not least, we consider work and play as two sides of the same coin and believe in having some serious fun!
We are looking to extend our team with a data engineer who can help us set up our Big Data environment. Using Docker, we are able to set up a complete datalake, processing and reporting environment per customer within minutes. Our vision is to provide our customers with self-service components covering everything from data ingestion and preparation to analysis and the visualisation of insights. For the initial data gathering, customers should be able to connect any system and/or database to their private datalake and select the data they need for processing in Python and Spark. You will be part of the team that researches, tests and deploys the best environment to do so. Implementing different models to process the customer’s data is also part of your daily activities, in collaboration with our data scientists and researchers.
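The per-customer environment described above could be sketched as a Docker Compose file along these lines. This is a minimal, illustrative sketch only: the service names, images and ports are assumptions for the sake of example, not Odyssey Prime's actual stack.

```yaml
# Hypothetical per-customer stack: one datalake, one Spark cluster,
# one notebook front-end for Python/PySpark work.
services:
  datalake:
    image: apache/hadoop:3        # illustrative HDFS-backed datalake
    command: ["hdfs", "namenode"]
    ports:
      - "9870:9870"               # HDFS web UI

  spark:
    image: apache/spark:3.5.0     # illustrative Spark standalone master
    command: ["/opt/spark/sbin/start-master.sh"]
    ports:
      - "8080:8080"               # Spark master web UI
    depends_on:
      - datalake

  notebook:
    image: jupyter/pyspark-notebook  # self-service Python/Spark workspace
    ports:
      - "8888:8888"
    depends_on:
      - spark
```

Spinning up a fresh customer environment would then be a single `docker compose up -d`, which is what makes the "minutes, not days" provisioning claim plausible.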
- Python experience
- Spark experience
- Hadoop experience
- Confident with command line tools
- Knowledge of (NoSQL) databases and/or data models; data security and privacy (algorithms for data integration/capture)
- A completed Bachelor’s or Master’s degree in data/information science or a similar field
- A passion for new technology and clean, well-engineered code
- Familiar with Agile development
- Optional (bonus for us):
- Experience with Scala/Java
- Experience with PySpark
- Experience with MongoDB
- Knowledge of machine learning (several regression and classification techniques)
- Experience with one or more of these: Pig, Hive, Yarn, Flume, Kafka, Storm, Sqoop, Hue, Zookeeper, Impala
- At least 30 holiday days a year; unused days are donated to charity
- You define your own working hours as long as the job gets done
- Travel cost compensation
- Friendly colleagues that will support and challenge you
- Training & development opportunities
- Free lunch
- Fitness at the office
- Regular fun events
- Competitive salary
- To apply, send your CV to firstname.lastname@example.org
- 1st interview in the Amsterdam office with the lead developer
- 2nd interview in the Amsterdam office, where you will do a coding assignment. If we like your code, we will offer you a contract the same day and invite you to have a beer with us.