Data Engineer (SQL, Databricks) (Remote, Full-Time)

Full-time

Posted: 6 days ago

You will be joining Smart Working, a company that values your personal and professional growth, offering you a sense of belonging in a remote-first world. As a Data Engineer, your role will involve building and maintaining scalable data pipelines, integrating systems into the Delta Lake environment, and supporting advanced analytics and reporting. You will collaborate with a team to ensure reliable data flows and develop data models that enable new business metrics.
**Responsibilities:**

- Design, build, and maintain data pipelines connecting internal systems to the Delta Lake environment
- Develop SQL-based data transformations and relational data models for analytics and reporting
- Integrate new data sources and systems into the data platform
- Ensure reliable, scalable, and well-structured data flows across systems
- Support the development of data architecture and models for new business capabilities
- Monitor, troubleshoot, and improve existing pipelines for data accuracy and reliability
- Contribute to documentation, data standards, and best practices
- Collaborate with stakeholders to meet evolving business requirements
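To give a flavor of the SQL-based transformation work described above, here is a minimal illustrative sketch. It uses Python's built-in SQLite for portability rather than Databricks/Delta Lake, and all table and column names (`raw_orders`, `customer_revenue`) are hypothetical, not part of the actual role:

```python
# Illustrative only: the role uses Databricks/Delta Lake, but the same
# pattern (raw data -> SQL transformation -> analytics-ready model) is
# shown here with stdlib SQLite so it runs anywhere.
import sqlite3

conn = sqlite3.connect(":memory:")

# Hypothetical raw source table, as might be loaded from an internal system.
conn.executescript("""
    CREATE TABLE raw_orders (order_id INTEGER, customer TEXT, amount REAL, status TEXT);
    INSERT INTO raw_orders VALUES
        (1, 'acme',   120.0, 'shipped'),
        (2, 'acme',    80.0, 'cancelled'),
        (3, 'globex', 200.0, 'shipped');
""")

# SQL-based transformation: filter out cancelled orders and aggregate
# per customer into an analytics-ready reporting table.
conn.executescript("""
    CREATE TABLE customer_revenue AS
    SELECT customer, COUNT(*) AS orders, SUM(amount) AS revenue
    FROM raw_orders
    WHERE status = 'shipped'
    GROUP BY customer;
""")

print(conn.execute(
    "SELECT customer, revenue FROM customer_revenue ORDER BY customer"
).fetchall())
# → [('acme', 120.0), ('globex', 200.0)]
```

In Databricks, the same `CREATE TABLE ... AS SELECT` pattern would typically target a Delta table instead of an in-memory database.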
**Requirements:**

- Strong SQL experience for querying, transformation, and data modeling
- Hands-on experience with Databricks for data pipelines and analytics workflows
- Experience with large-scale data platforms or data lake architectures
- Familiarity with Apache Spark for distributed data processing
- Experience designing and maintaining data pipelines and ETL/ELT processes
- Ability to work collaboratively in cross-functional engineering teams
- Attention to detail in building reliable data infrastructure
**Nice to Have:**

- Experience using Python for data processing and pipeline development
- Familiarity with API development and integrating external data sources
- Exposure to Delta Lake or similar modern data lake architectures
- Experience building analytics-ready data models for reporting and business intelligence
- Interest in enhancing data engineering practices

At Smart Working, you will benefit from fixed shifts, no weekend work, day-one benefits including a laptop and medical insurance, and support through mentorship and community. You will be part of a culture built on integrity, excellence, and ambition, where your contributions are genuinely valued. If you are ready to be an empowered Smart Worker in a company that celebrates true belonging, Smart Working is the place for you.