Job Description
Experience: 5+ years
Job Summary:
We are looking for an experienced Python Developer with a strong background in data integration to join our dynamic team. The ideal candidate will play a key role in developing robust ETL/ELT pipelines, ensuring efficient data flow and integration. If you have a passion for Python coding, expertise in database systems, and a solid understanding of ETL/ELT processes, we encourage you to apply.
Requirements and Responsibilities:
Utilize Python coding expertise to design, implement, and optimize data processing solutions.
Develop and maintain ETL/ELT pipelines for seamless data integration.
Work with databases, including PostgreSQL, MySQL, or MS SQL, to manage and manipulate data effectively.
Utilize AWS services, particularly Lambda, for scalable and efficient data processing.
Implement integration solutions using API Gateways.
Write documentation regarding data processing solutions.
Apply Unix knowledge to facilitate data processing and system operations.
Apply software engineering best practices throughout the development life cycle, including source control, build automation, testing, and performance tuning.
Work closely with cross-functional teams to integrate data solutions into business processes.
Nice to Have:
Knowledge of data warehousing products like AWS Redshift, GCP BigQuery, or Oracle DW.
Experience working with customer and behavioral datasets, and familiarity with common CRM and sales patterns.
Experience with GCP or Azure.
Experience developing and maintaining web applications using frameworks such as Django.
If you are passionate about data integration, Python development, and leveraging cloud services, and you have a keen interest in contributing to cutting-edge projects, we welcome your application. Join us as we drive innovation through seamless data solutions.