Posting date: 27-08-2018 | Closing date: 31-08-2018
Private and Confidential
Senior Developer / Tech Lead
(Kuala Lumpur, Nationwide)
Job Description

Contract : 12 months

Location : Kuala Lumpur

Benefits : Competitive salary with EPF and SOCSO

Nationality : Malaysian

Experience : 2 to 5 years

Job Purpose:

  • The job involves data ingestion work to bring large volumes of data into our Big Data Lake.
  • The role is pivotal in building new data pipelines from various structured and unstructured sources into Hadoop/big data.
  • You will be working closely with data consumers and source owners to create the foundation for data analytics and machine learning activities.

You will focus on:

  • Identifying data ingestion patterns and building a framework to efficiently ingest data into our Data Lake
  • Data modelling across finance, risk and compliance business data in the Bank
  • Performance tuning ingestion jobs to improve throughput and analyzing huge data sets
  • Improving the ELT process by automating the build, test, and deployment framework
  • Optimizing and monitoring the performance of the big data system
  • Integrating data from source systems with the Bank's standard architecture

Key Responsibilities :

  • Design and develop processes that leverage Hadoop/CouchDB/Mongo/etc. to create data stores that enable business objectives.
  • Provide technical expertise and recommend technologies that drive value, performance, and capabilities.
  • Work as a member of cross-functional teams including other Development teams, Quality Assurance, Release Engineering, and Support to create reliable, scalable, and supportable products and capabilities.
  • Implement ETL processes.
  • Monitor performance and advise on any necessary infrastructure changes.
  • Define data retention policies, ITIL service management, and the operating model.
  • Align with business requirements and plans, and meet the associated big data and data science requirements.
  • Ensure all Group data and operations comply with Group, local, and regional regulations.
  • Carry out any other responsibilities / tasks assigned by management from time to time.


Job Requirements

Required Skills : Hadoop, Big Data, ETL techniques, Unix/Linux, Java, Scala/Spark, HBase, Cassandra, HiveQL/Hive UDFs, RDBMS (SQL, Oracle, MS SQL Server)


Only shortlisted candidates will be notified.

Interested applicants are invited to email a comprehensive resume, including current and expected salary, to: sharmileehr@gmail.com

Private and Confidential
55100, Kuala Lumpur
Kuala Lumpur

Email: sharmileehr@gmail.com


Job Ref. : Hadoop/Big Data
Salary : 10000
Employment Type : Contract or Temporary
No of Vacancies : 2
Min. Qualification : Bachelor's Degree
Min. Experience : 5 Year(s)