Hadoop Developer

  • Anywhere

About Our Company

Our multi-award-winning organisations Mekong Big Data, DataU Academy and DOM.agency are rapidly evolving the Kingdom’s technology and data sectors, ushering in a new era of growth, productivity and prosperity.

We are a dynamic, international team that seeks excellence in our project management and in the impact we have on our clients’ businesses. Training and learning are in our DNA as a business, so you will learn a great deal and apply it to client projects on a continual basis.

With a firm dedication to best practices, staff & team development and a company culture centred around the collective power of collaboration and innovation, we are seeking the brightest, sharpest and most eager minds to join our tribe.

As we embark on regional and international expansion, we’re looking for passionate, growth-orientated minds to join our mission and change the future of Cambodia and beyond.

Job Description Summary

We are seeking Hadoop Developers to help us architect, design, and develop big data solutions and the infrastructure needed to capture, store, and retrieve vast amounts of data, ensure functionality, and deliver state-of-the-art big data products. Knowledge of existing tools is essential, as is the capacity to write bespoke software and solutions using the Hadoop API. Your primary responsibility will be to design, build, and maintain Hadoop infrastructure. You may also be required to evaluate existing data solutions, write scalable ETLs, develop documentation, and train staff. You will have the opportunity to apply your skills and make a real impact on our clients’ businesses. This is both a hands-on and client-facing position, requiring solid technical skills as well as excellent soft skills such as interpersonal and communication skills.

The ideal candidate will be autonomous, creative and self-motivated to deliver excellence in their work. You will join a team that is passionate about teamwork and supporting one another in the spirit of the Ubuntu philosophy. This position is also open to remote work applicants.

Responsibilities

  • Create and implement big data solutions based on business problems, requirements, and needs

  • Provide advisory and consultation services on big data engineering and Hadoop infrastructure, liaising among stakeholders to drive implementation success

  • Design and implement big data infrastructure solutions to meet clients’ usability and performance needs

  • Design and code Hadoop applications to analyse data collections

  • Design and evaluate data models, review and improve functional data models created by Data Engineers, Juniors, and interns

  • Design scalable ETL/ELT data pipelines and apply ETL/ELT best practices for data pipelines within the big data architecture

  • Understand and implement Hadoop’s security mechanisms

  • Build, operate, monitor, and troubleshoot Hadoop infrastructure

Skills and Qualifications

  • Bachelor’s degree or higher in a relevant subject such as software engineering, data engineering, computer science, statistics, economics, finance, mathematics, or a related field

  • Previous experience as a Big Data Engineer/Hadoop Developer; experience gained in the finance, banking, or insurance sector, or at a consulting firm, is a plus

  • Advanced knowledge of the Hadoop ecosystem and its components

  • In-depth knowledge of Hive, HBase, and Pig

  • Familiarity with MapReduce and Pig Latin scripts

  • Experience with programming languages such as SQL, T-SQL, and Python

  • Experience with data analytics and visualization tools like MS Power BI, Google Data Studio, Tableau, Grafana, Apache Superset, etc.

  • Familiarity with common database systems

    • Relational databases (RDBMS): Postgres, MySQL, Oracle, MS SQL Server

    • Cloud data warehouses and storage: Redshift, Azure SQL Database, Amazon S3, etc.

    • NoSQL databases: MongoDB, Cassandra, HBase

    • Graph and in-memory databases: Neo4j, Amazon Neptune, Redis, etc.

  • Strong desire to establish best-practice standards, automation workflows, and frameworks

  • Soft and interpersonal skills:

    • Great communication skills; able to present results to a non-technical audience

    • Excellent leadership and mentoring skills

    • Attention to detail; act as a detective to identify and fix database problems and data-quality issues

    • Creativity in solving problems; think inside and outside the box to connect the dots

    • Positive, can-do yet humble attitude; get excited about learning and challenging work

    • Ability to abstract general principles from specifics

Benefits

  • Competitive salary

  • 28 days leave (inclusive of Cambodian national holidays)

  • Professional development fund

  • Team social calendar

  • Flat, respectful, and supportive management structure (Ubuntu philosophy)

To apply for this job email your details to adjani@mekongbigdata.com