Principal Infrastructure Engineer (Hadoop)

Prague, Hlavní město Praha, Czechia · Global Business Services · 26024 · 24/03/2020

SUMMARY 

Begin your career with DHL IT Services!

We are seeking a talented Principal Infrastructure Engineer to join our team. You will be expected to deliver technical and architectural designs and standards and to provide 3rd-line operational support. You will act as the SME (Subject Matter Expert) for the deployment of Data Lake and Big Data solutions, supporting business initiatives at both the global and regional level. You will also provide technical consultancy to internal and external DHL customers to support implementation planning for the successful deployment of new infrastructure technologies.

WHAT YOU WILL DO

Select and integrate the Big Data tools and frameworks required to provide requested capabilities

Work on collecting, storing, processing, and analyzing huge sets of data across DHL and its applications/infrastructure

Build databases and tables, as well as develop data integration and ingestion processes

Design and build the big data infrastructure platform, primarily based on MapR/Hadoop technologies, ensuring that the infrastructure is highly available and secure

Develop and build a security-compliant user management framework for the multi-tenant big data platform

Collaborate with other technology teams and architects to define and develop cross-functional technology stack interactions

Define technical requirements, conduct performance tuning and provide analytical and business user support.

Take on other database engineering tasks, including but not limited to deploying databases for business services, reviewing data designs, and tuning SQL where required

Ensure the smooth running of new and existing Data Lake services in the production and development environments

Develop and maintain build standards relevant for Data Lake Infrastructure platforms 

Design and implement infrastructure improvements such as automation and/or shared platforms

Implement solutions that are scalable, supportable, and aligned with global standards and industry best demonstrated practices

APPLICATIONS YOU WILL USE

Hadoop ecosystem components (Hadoop, MRv2, YARN, etc.)

MapReduce & MapR

Spark SQL

Scala Programming Language

Apache Spark

NoSQL databases – HBase/MongoDB

Hive, HiveQL and Impala

Apache Kafka

Apache Drill

RDBMS – Oracle Database, MySQL, Microsoft SQL Server

Operating systems – Red Hat Linux, Microsoft Windows

Monitoring software – Splunk, HP OMi

WHAT YOU SHOULD HAVE

A minimum of 7 years of experience as a technology leader designing and developing data models, with at least 3 of those years specializing in big data architecture or data analytics (including technologies such as Hadoop, NoSQL, MapReduce, and other industry big data frameworks)

Familiarity with the Hadoop/MapR open-source stack, including YARN, Kafka, Hive, and Pig, as well as relational databases

Broad-based architecture acumen: database architecture, ETL, SOA, cloud, etc.

Comfortable in multi-terabyte production environments

Highly proficient with large data sets and clusters as well as programming integration solutions

Demonstrated skill in critical thinking and problem-solving methods

Ability to use appropriate facilitation techniques to gain agreement or move others to action

Ability to work successfully in a global team environment

WHAT IS A PLUS

Familiarity with various tools such as AWS, Mesos or Docker and an instinct for automation 

Experience with Java oriented technologies (JBoss, Spring, SpringMVC, Hibernate, REST/SOAP)

Advanced working SQL knowledge and experience with relational databases (Oracle, MS SQL, MySQL, Informix), including query authoring, as well as working familiarity with a variety of other databases

Experience with object-oriented/functional scripting languages such as Python, Java, C++, etc.

WHAT YOU WILL GET FROM US

Internal and external training

A great team of professionals and opportunities for development

Full support from colleagues during onboarding

Modern offices in Chodov, next to a metro station

Huge number of internal job opportunities within the company

Home office

Permanent contract

Company car or car allowance

CAFETERIA employee benefit program with a wide selection of benefits from Edenred

Extra week of holiday (25 days/year), 6 self-sickness days per year, full salary compensation for up to 10 days of absence due to illness per calendar year, lunch vouchers fully covered by the company

Multisport card, mobile phone and laptop, fruit days, sports clubs for employees, referral program, and more

Smart casual dress code

Sound good? Start your application now!


DHL IT Services – About Us

IT Services is the internal provider of specialized IT Build services and industrialized IT Run services to Deutsche Post DHL (DPDHL) Group:

Supports over 260,000 DPDHL e-mail users; 

Runs more than 7,700 servers;

Supports more than 2000 global services and applications;

Processes 9 million shipment information messages per day;

Delivers 200,000 person-days of application development per year.

Apart from being more than 4,500 highly skilled IT professionals with an intimate knowledge of the logistics industry, we at IT Services altogether represent more than 80 nationalities. IT Services works behind the scenes 24 hours a day, 7 days a week, 365 days a year in data centers and offices across three continents: the Americas (Mechanicsburg, Westerville, and Tempe in the USA, as well as Mexico, Costa Rica, and Brazil), Europe (Prague in the Czech Republic; Bonn and Darmstadt in Germany), and Asia (Cyberjaya in Malaysia and Chennai in India).


Facts and Figures

  • Working Hours

    40

  • Business Unit

    DHL Information Services (Europe) s.r.o.

  • Travel Required

    Less than 20%

  • Employment Type

    Permanent Full-Time

  • Shift Requirement

    None
