Senior Data Engineer

At AstroPay, we believe in empowering people to reach their full potential and to be part of an innovative and forward-thinking company. Our goal is to provide a cutting-edge online payment solution that goes beyond just a traditional wallet. We are dedicated to creating a dynamic and challenging work environment that fosters creativity, innovation, and a strong sense of community among our team.

Our multinational and multicultural team is made up of talented and motivated individuals who are passionate about delivering the best possible experience to our customers and users. We value teamwork, collaboration, and a can-do attitude, and we’re always looking for new talent to join our growing company.

If you’re looking for an exciting opportunity to work with a dynamic and innovative company, AstroPay is the perfect place for you. With our entrepreneurial spirit and drive to succeed, we offer an environment where you can grow both personally and professionally. Join us today and be part of our mission to revolutionize the online payment industry.

Primary Responsibilities & Expectations:

  • Designing and Developing Data Pipelines and ETLs: You will play a crucial role in designing, building, and maintaining data pipelines and ETL/ELT processes, using tools like AWS DMS, Airflow, and other AWS services (see the sketch after this list).

  • Manage ingestion of raw transactional and non-transactional data into an S3 data lake. 

  • Ensure efficient data flow across various systems within AstroPay, enabling data-driven decision-making.

  • Data Lake and Warehouse Integration: As a Lead / Senior Data Engineer, you will be responsible for managing and enhancing our data architecture, which combines and integrates a data lake with a relational data warehouse.

  • Ensure seamless integration of raw data into the S3 data lake before applying the necessary transformations and delivering clean, structured data to downstream systems. Workloads are mostly batch processing, with some streaming requirements as well.

  • Data Modeling: Apply transformations, enrichment, and data processing using the most appropriate tools for each workload. We follow medallion architecture principles.

  • Apply best practices in data engineering, including version control, testing, and documentation.

  • Collaboration with Cross-Functional Teams: Collaborate closely with engineers, data analysts, infrastructure teams, and other stakeholders to understand their data needs and deliver solutions that enable data-driven decision-making across the organization.

  • Data Governance and Security: Implement data governance best practices. Establish and maintain a secure data environment that complies with regulations in the different countries where we operate.

  • Create Reusable Data Assets: Develop and implement strategies for creating reusable data assets, promoting efficiency, and reducing redundancy in data processing.

  • Build Trust in Our Data: Implement measures to enhance data quality and build trust in the accuracy and reliability of our data. This includes data validation, quality checks, and documentation.
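
To give a concrete flavor of the pipeline work above, here is a minimal sketch of a daily ingest-transform-validate DAG, assuming Airflow 2.x and boto3. The bucket, keys, sample rows, and DAG id are hypothetical placeholders for illustration, not AstroPay's actual pipeline.

    from datetime import datetime
    import json

    import boto3
    from airflow import DAG
    from airflow.operators.python import PythonOperator

    BUCKET = "example-data-lake"  # hypothetical S3 data lake bucket


    def ingest_raw(ds, **_):
        # Land a raw daily extract in the bronze layer; a real pipeline would
        # pull this payload from DMS or a source system, not inline data.
        rows = [{"tx_id": 1, "amount": 10.0}, {"tx_id": 2, "amount": None}]
        boto3.client("s3").put_object(
            Bucket=BUCKET,
            Key=f"bronze/transactions/dt={ds}/data.json",
            Body=json.dumps(rows),
        )


    def transform_to_silver(ds, **_):
        # Clean bronze data into the silver layer (medallion pattern):
        # here, simply dropping rows with a missing amount.
        s3 = boto3.client("s3")
        raw = json.loads(
            s3.get_object(
                Bucket=BUCKET, Key=f"bronze/transactions/dt={ds}/data.json"
            )["Body"].read()
        )
        clean = [r for r in raw if r["amount"] is not None]
        s3.put_object(
            Bucket=BUCKET,
            Key=f"silver/transactions/dt={ds}/data.json",
            Body=json.dumps(clean),
        )


    def check_quality(ds, **_):
        # Basic trust gate: fail the run if the silver partition came out empty.
        s3 = boto3.client("s3")
        clean = json.loads(
            s3.get_object(
                Bucket=BUCKET, Key=f"silver/transactions/dt={ds}/data.json"
            )["Body"].read()
        )
        if not clean:
            raise ValueError(f"no valid transactions for {ds}")


    with DAG(
        dag_id="transactions_etl",
        start_date=datetime(2024, 1, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        ingest = PythonOperator(task_id="ingest_raw", python_callable=ingest_raw)
        silver = PythonOperator(
            task_id="transform_to_silver", python_callable=transform_to_silver
        )
        quality = PythonOperator(task_id="check_quality", python_callable=check_quality)
        ingest >> silver >> quality

In production the transform step would typically run in Spark, Glue, or SQL rather than on the Airflow worker, but the ingest-transform-validate shape stays the same.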

Core Competencies and Skills:

Must-Have:

  • Expertise in Data Pipelines: Proven experience designing, developing, and maintaining data pipelines for diverse workloads using AWS services like DMS, Airflow, Lambda, and Glue.

  • Data Lake and Warehouse Architecture: Proven experience managing data lakes and relational data warehouses. Familiarity with Apache Iceberg, S3-based data lakes, and Redshift.

  • AWS Cloud Knowledge: Extensive experience with AWS services, including Redshift, Lambda, Airflow, DMS, and S3. You should be comfortable working with these services to build and maintain data solutions (see the sketch after this list).

  • Collaborative Problem Solver: Strong communication skills to collaborate effectively with cross-functional teams and proactively own data challenges.

  • Software Engineering Background: A solid foundation in software engineering principles, applied as best practices to our data architecture and processes. Some Python experience is also required.

  • English: Working proficiency, to interact with the rest of the company.
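
As an illustration of the lake-to-warehouse integration these requirements describe, here is a minimal sketch that loads curated S3 data into Redshift through the Redshift Data API (boto3). The cluster, database, user, role ARN, and table names are hypothetical placeholders.

    import boto3

    client = boto3.client("redshift-data")

    # Bulk-load a curated silver partition into the warehouse via COPY.
    copy_sql = """
        COPY analytics.transactions
        FROM 's3://example-data-lake/silver/transactions/'
        IAM_ROLE 'arn:aws:iam::123456789012:role/example-redshift-load'
        FORMAT AS JSON 'auto';
    """

    response = client.execute_statement(
        ClusterIdentifier="example-cluster",
        Database="analytics",
        DbUser="etl_user",
        Sql=copy_sql,
    )
    print(response["Id"])  # statement id for status polling

COPY is Redshift's native bulk-load path from S3; the Data API call is asynchronous, so a real job would poll describe_statement with the returned Id before marking the load complete.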

Nice to Have:

  • Familiarity with Database Administration to fine-tune database performance and ensure data integrity.

  • Familiarity with Infrastructure Management and other AWS services, such as CodeCommit, SageMaker, CloudWatch, Step Functions, Athena, and Glue, is a valuable addition to your skill set.

  • Familiarity with BI Tools: Looker or similar tools.

  • Build Data Models: Experience building data models, along with an understanding of data catalogs, data governance, and data lineage.

  • Fintech: Prior experience working in the fintech ecosystem will be considered an advantage.

Benefits:

  • Flexible hours: We are results-oriented.

  • Professional growth: Watch your professional career take off. Explore your passions.

  • Fully remote: Work from anywhere.

  • AstroTeam: Connect with your team and have fun.

  • AstroPay House: Meet and connect with AstroPayers from all over the world.

  • Training: Keep building your knowledge on the edX platform.
