The Data Engineer at CCL will work with cloud-based platforms to build efficient, scalable, and secure data pipelines. This role requires strong expertise in AWS services, Python, PySpark, and ETL development, with an emphasis on ensuring data integrity and enabling automation within production environments. You’ll collaborate with data teams to integrate and process large datasets from diverse sources.
Key Responsibilities

  • Data Pipeline Development:
    • Build robust data pipelines using AWS services (such as AWS Glue and Amazon S3) for efficient data processing and ETL (Extract, Transform, Load) tasks.
    • Develop Python and PySpark scripts for complex data transformations, ensuring high levels of performance and reliability.
  • Database and API Management:
    • Maintain databases for secure and efficient data storage, ensuring seamless transmission of data through APIs.
  • Pipeline Monitoring:
    • Monitor production pipelines for performance and data quality, developing tools and systems for effective pipeline management and troubleshooting.
  • Collaborative Work:
    • Work with cross-functional development teams, ensuring data pipelines meet integrity and scalability requirements.
    • Contribute to team discussions, code reviews, and brainstorming sessions on best practices for data engineering and architecture.

Skills, Knowledge, and Expertise

  • Technical Skills:
    • Strong experience with AWS, particularly AWS Glue and related cloud computing services, along with PySpark.
    • Proficient in Python and the development of scalable, efficient ETL data pipelines.
    • Excellent knowledge of data pipeline architecture, Big Data implementation, and data modeling.
    • Familiarity with Scrum/Agile methodologies for development and delivery.
    • Experience with databases (e.g., SQL, NoSQL) and managing large-scale datasets.
  • Additional Expertise:
    • Experience with data APIs and integrating multiple data sources.
    • Financial domain knowledge is a bonus but not a requirement.
  • Soft Skills:
    • Outstanding communication and interpersonal skills.
    • Ability to collaborate with global teams and manage multiple tasks in a fast-paced environment.
    • Fluent in English (both written and spoken).

Qualifications

  • Bachelor’s degree in Computer Science, Information Technology, or a related field, or equivalent practical experience.
  • A minimum of 2 years’ experience in a similar data engineering role and at least 3 years overall in the IT field.
  • Proven track record of developing and managing complex data pipelines in cloud environments, specifically with AWS.

Why Join Cybernetic Controls Ltd?

  • Competitive Salary: Enjoy a competitive salary with opportunities for growth.
  • Remote Work: Work remotely from anywhere in Pakistan.
  • Challenging Work: Take on engaging and complex data engineering projects within the finance industry.
  • Growth Opportunities: CCL is growing rapidly, offering you the chance to advance your career within a dynamic and collaborative environment.

How to Apply

Interested candidates should submit their CV and cover letter to Aqib Shah (Recruitment Lead), or apply directly on our website. In your application, please ensure your CV highlights your relevant work experience, skills, and certifications that demonstrate your expertise in data engineering and cloud technologies.

Asad Hameed