Company "TRT"
Our client is a leader in the single-family rental (SFR) investment market, offering a comprehensive platform designed to make real estate investing more accessible, cost-effective, and straightforward. They combine a deep passion for helping investors build wealth through real estate with cutting-edge technology that redefines the investment process.
About the team:
The Data Engineering team is the core of Data, on which everything else relies. The team is responsible for the development and management of the Enterprise Data Platform, which powers the company and all respective Data functions. The Enterprise Data Platform is crucial for integrating, managing, and providing data across the business. There are multiple sub-disciplines within Data Engineering, each contributing to the overall effectiveness and efficiency of data operations. It is a highly cohesive team consisting of four pods: Data Infrastructure, ML & GenAI Ops, Analytics Engineering, and Data Services.
They architect and build the core data infrastructure that supports the entire company, build data ingestion from internal and external applications, support infrastructure for ML & GenAI products and applications, merge various data feeds into easy-to-use, valuable data sets that support analytics, and design and create scalable, packaged data solutions in the form of various data services.
About the role:
We are looking for a talented Senior Data Engineer to join the Data Services pod in the established Data Engineering team.
As a Senior Data Engineer, you will be instrumental in architecting and constructing a new version of the data services platform, Data Services 2.0!
They operate a modern all-cloud data stack that includes AWS, Airflow, Docker, DBT, Python, Snowflake, Sigma, Java/Kotlin and our old friend SQL.
What you will do:
● Improve and maintain the data services platform.
● Deliver high-quality data services promptly, ensuring data governance and integrity while meeting objectives and maintaining SLAs for data sharing across multiple products.
● Develop effective architectures and produce key code components that contribute to the design, implementation, and maintenance of technical solutions.
● Integrate a diverse network of third-party tools into a cohesive, scalable platform, optimizing code for enhanced scalability, performance, and readability.
● Continuously improve system performance and reliability by diagnosing and resolving unexpected operational issues to prevent recurrence.
● Ensure that your team’s work undergoes rigorous testing through repeatable, automated methods.
● Support the data infrastructure and the rest of the data team, which designs, implements, and deploys scalable, fault-tolerant pipelines that ingest and refine large, diverse datasets (structured, semi-structured, and unstructured) into simplified, accessible data models in production.
● Collaborate with cross-functional teams to understand data flows and design, build and test optimal solutions for engineering challenges.
● Operate within an Agile/Scrum framework, working closely with Product and Engineering teams to deliver value across multiple services and products.
● Influence and shape the enterprise data platform and services roadmap, architecture, and design standards. Collaborate with technology leaders and team members to design, adapt, and enhance the architecture to meet evolving business needs.
Qualifications:
● BS or MS in a technical field: computer science, engineering, or similar.
● 8+ years of technical experience working with data.
● 5+ years of strong experience building scalable data services and applications using SQL, Python, or Java/Kotlin, with the interest and aim to learn additional tools and technologies.
● Deep understanding of microservices architecture and API development, including gRPC, REST/SOAP, and GraphQL.
● Experience with AWS services, including messaging (SQS, SNS), and familiarity with real-time data processing frameworks such as Apache Kafka or AWS Kinesis.
● Significant experience building and deploying data-related infrastructure, robust data pipelines (beyond simple API pulls), and ETL/ELT code encompassing messaging, storage, compute, transformation, and execution.
● Experience in identifying and proposing initiatives aimed at enhancing the performance and efficiency of existing systems, setting the standard for SLAs & SLOs.
● Strong communication and interpersonal skills.
● Experience managing a team or experience working with an on-shore/off-shore model is a plus.
Nice-to-have:
● Knowledge of AWS and Azure cloud services.
● Previous experience in a start-up or agile environment.
● Additional experience with Snowflake and Airflow is desirable.
Location: This is a remote position; however, the ideal candidate should be available to work from 9 am to 12 pm Pacific Time.