Our client is now recruiting for a Cloudera Hadoop Data Architect to be based in Riyadh, KSA
Duration – 9 to 12 months (extendable)
We are seeking a motivated and experienced Cloudera Hadoop Data Architect for our Data Team. The Data Architect will align with enterprise strategy, elicit strategic data requirements, and outline detailed integrated designs to meet those business requirements.
- Design and implement Cloudera Hadoop-based big data solutions
- Strong understanding of the Cloudera CDP platform, including components such as HDFS, Hive, Sqoop, and Flume, and of the tooling required to manage ETL and data movement from source systems into CDP
- Should be able to define the end-state architecture and sizing for Cloudera (on-premises)
- Should have a good understanding of data integration best practices to define the ETL strategy from data sources to the Cloudera data lake
- Experience with Hadoop technologies: HDFS, MapReduce, Hive, HBase, and Cloudera Manager.
- Good working knowledge of Pig scripting, Oozie workflows, and HBase.
- Banking domain knowledge is an added advantage, along with strong team leadership and communication skills.
- Work closely with Business Partners and Analysts to understand data needs, document requirements, and develop detailed solutions.
- Analyze source systems and perform hands-on data analysis.
- Establish, maintain, and adhere to Enterprise Data Standards & Policies.
- Experience implementing a data archival strategy in Hadoop
- Define data governance standards
- Strong analytical and problem-solving skills.
- Strong written and verbal communication skills.
- Ability to work effectively under pressure with constantly changing priorities and deadlines.
- Familiarity with project management and systems development life cycle (SDLC) processes, tools, concepts, and methodologies is a plus.
- Ability to work independently.