Data engineer, experience 5+ years IRC228024
Description:
Strong experience with Postgres and SQL, including working with, supporting, and maintaining relational databases.
Experience building and optimizing ‘big data’ data pipelines, architectures, and data sets.
Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
Experience with AWS cloud services.
Experience migrating data across databases/environments with zero downtime.
Experience with AWS Glue, the AWS data integration tool.
Experience supporting complex ETL and integration projects.
Create and maintain comprehensive documentation for advanced analytics models, data dictionaries, and data flows.
Hands-on engineering of solutions and data pipelines utilizing CI/CD and automation.
Design and maintain testing processes and automation workflows for data feature changes, upgrades, and releases.
Strong analytic skills related to working with semi-structured datasets.
Build processes supporting data transformation, data structures, metadata, dependency and workload management.
A successful history of manipulating, processing and extracting value from large, disconnected datasets.
Working knowledge of message queuing, stream processing, and highly scalable ‘big data’ data stores.
Strong project management and organizational skills.
Experience supporting and working with cross-functional teams in a dynamic environment.
Others: Vagrant, GitHub, Docker, Terraform.
We are looking for a candidate with 5+ years of experience in a ‘big data’ engineering, administration, or development role. They should also have experience with one or more of the following software/tools:
Experience with relational SQL and NoSQL databases, including Postgres.
Experience with cloud technology (AWS cloud preferred).
Experience with Snowflake & dbt.
Experience with data integration tools, ETL frameworks, and data pipeline orchestration tools (AWS Glue, Airflow); a minimal orchestration sketch follows this list.
Proficiency in leading data visualization tools (Tableau or Power BI).
Relevant tertiary qualification.
Ability to independently lead working sessions.
Excellent communication skills both written and verbal.
Payments industry experience.
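To make the orchestration requirement concrete, below is a minimal sketch of an Airflow DAG that triggers a pre-existing AWS Glue job. The DAG id, Glue job name, and schedule are hypothetical, and the snippet assumes Airflow 2.4+ with the Amazon provider package installed; treat it as an illustration, not a reference implementation.

```python
# Minimal sketch: orchestrate an existing AWS Glue job from Airflow.
# Assumes apache-airflow>=2.4 and apache-airflow-providers-amazon are
# installed; the DAG id and Glue job name below are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.providers.amazon.aws.operators.glue import GlueJobOperator

with DAG(
    dag_id="nightly_terminal_ingest",   # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                  # run once per day
    catchup=False,                      # skip backfilling past runs
) as dag:
    # Start the Glue ETL job and block until it finishes, so any
    # downstream tasks only run against fully loaded data.
    run_glue_job = GlueJobOperator(
        task_id="run_glue_job",
        job_name="nightly-terminal-ingest",  # Glue job assumed to exist in AWS
        wait_for_completion=True,
    )
```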
Job Responsibilities:
Operations & Administration
Create and maintain optimal data pipeline architecture, combining raw information from different sources.
Keep our data separated and secure across national boundaries through multiple data centers and cloud regions.
Define users and enable data distribution to the right user, in an appropriate format and in a timely manner.
Minimize database downtime and manage parameters to provide fast query responses.
Perform tests and evaluations regularly to ensure data security, privacy and integrity.
Monitor database performance (see the monitoring sketch after this list), implement changes, and work with the DevOps team to apply new patches and versions when required.
Build database systems of high availability and quality depending on each end user’s specialised role and business value.
Build algorithms and prototypes.
Optimize data retrieval.
Explore and advocate ways to enhance data quality and reliability.
Design, implement, and maintain different reporting solutions.
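As one concrete example of the monitoring work above, this sketch lists the slowest statements recorded by Postgres. It assumes psycopg2 is installed, the pg_stat_statements extension is enabled (with PostgreSQL 13+ column names), and the connection string lives in a DATABASE_URL environment variable; all of those are assumptions, not part of the role description.

```python
# Minimal sketch: surface the slowest Postgres statements for tuning.
# Assumes psycopg2 is installed, pg_stat_statements is enabled, and the
# DSN is provided via the DATABASE_URL environment variable (hypothetical).
import os

import psycopg2

SLOW_QUERIES = """
    SELECT query, calls, mean_exec_time, total_exec_time
    FROM pg_stat_statements          -- column names as of PostgreSQL 13+
    ORDER BY mean_exec_time DESC
    LIMIT 10;
"""

with psycopg2.connect(os.environ["DATABASE_URL"]) as conn:
    with conn.cursor() as cur:
        cur.execute(SLOW_QUERIES)
        for query, calls, mean_ms, total_ms in cur.fetchall():
            # mean_exec_time / total_exec_time are reported in milliseconds.
            print(f"{mean_ms:9.1f} ms avg | {calls:7d} calls | {query[:80]}")
```

Statements surfacing here are the usual candidates for new indexes, rewritten joins, or parameter tuning.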
Analytics
Evaluate business needs and objectives by analyzing and organizing raw data.
Identify opportunities for data acquisition.
Prepare data for prescriptive and predictive modeling.
Design and drive implementation of solutions to provide easier access to raw data.
Assemble large, complex data sets that meet functional / non-functional business requirements.
Conduct complex data analysis and report on results.
Build analytics tools that utilize the data pipeline to provide actionable insights into payment terminals, operational efficiency and other key business performance metrics.
Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using ‘big data’ technologies, as sketched below.
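As a simple illustration of that extract-transform-load flow, the sketch below rolls raw payment-terminal events up into a daily metrics table. The file name, connection string, and table/column names are all hypothetical placeholders; it assumes pandas and SQLAlchemy are installed.

```python
# Minimal sketch: aggregate raw payment-terminal events into a daily table.
# File name, DSN, and table/column names are hypothetical placeholders.
import pandas as pd
from sqlalchemy import create_engine

engine = create_engine("postgresql+psycopg2://etl@localhost/warehouse")

# Extract: read a raw export produced by an upstream system.
raw = pd.read_csv("terminal_events.csv", parse_dates=["event_time"])

# Transform: count transactions per terminal per day.
daily = (
    raw.groupby([raw["event_time"].dt.date, "terminal_id"])
       .size()
       .reset_index(name="txn_count")
       .rename(columns={"event_time": "event_date"})
)

# Load: append into the table consumed by dashboards and reports.
daily.to_sql("terminal_daily_txns", engine, if_exists="append", index=False)
```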
Support / Collaboration / Mentorship
Work with stakeholders including the Development, Executive, Product, DevOps and Architect teams to assist with data-related technical issues and support their data infrastructure needs.
Work with stakeholders to strive for greater functionality in our data systems.
Mentor and train other team members.
Assist where necessary with knowledge transfer to other technical Invenco employees, particularly around data technologies.
Strategy and Design
Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
Design, build, and maintain different data stores based on different use cases.
Ensure the design and quality of our products are kept up to date with industry standards, particularly around security and compliance such as PCI DSS and Common Criteria.
Research new technologies when appropriate to ensure the ongoing strategy of DevOps within Invenco is heading in the right direction.
Lead by example with new technologies.
What We Offer
Exciting Projects: We focus on industries like High-Tech, communication, media, healthcare, retail and telecom. Our customer list is full of fantastic global brands and leaders who love what we build for them.
Collaborative Environment: You can expand your skills by collaborating with a diverse team of highly talented people in an open, laid-back environment — or even abroad in one of our global centers or client facilities!
Work-Life Balance: GlobalLogic prioritizes work-life balance, which is why we offer flexible work schedules, opportunities to work from home, and paid time off and holidays.
Professional Development: Our dedicated Learning & Development team regularly organizes communication skills training (GL Vantage, Toast Master), stress management programs, professional certifications, and technical and soft-skill trainings.
Excellent Benefits: We provide our employees with competitive salaries, family medical insurance, Group Term Life Insurance, Group Personal Accident Insurance, NPS (National Pension Scheme), periodic health awareness programs, extended maternity leave, annual performance bonuses, and referral bonuses.
Fun Perks: We want you to love where you work, which is why we host sports events and cultural activities, offer food at subsidized rates, and throw corporate parties. Our vibrant offices also include dedicated GL Zones, rooftop decks, and a GL Club where you can drink coffee or tea with your colleagues over a game of table tennis, and we offer discounts at popular stores and restaurants!
About GlobalLogic
GlobalLogic is a leader in digital engineering. We help brands across the globe design and build innovative products, platforms, and digital experiences for the modern world. By integrating experience design, complex engineering, and data expertise, we help our clients imagine what’s possible and accelerate their transition into tomorrow’s digital businesses. Headquartered in Silicon Valley, GlobalLogic operates design studios and engineering centers around the world, extending our deep expertise to customers in the automotive, communications, financial services, healthcare and life sciences, manufacturing, media and entertainment, semiconductor, and technology industries. GlobalLogic is a Hitachi Group Company operating under Hitachi, Ltd. (TSE: 6501), which contributes to a sustainable society with a higher quality of life by driving innovation through data and technology as the Social Innovation Business.