Computer Systems Engineer
The qualified candidate will have the following roles and responsibilities:
- Analyze business/functional requirements and translate them into technical specifications.
- Collaborate with product and business teams to coordinate requirements, and work with a multidisciplinary development team.
- Participate in the full software development life cycle, including design, architecture, coding, quality analysis, troubleshooting, and debugging.
- Perform data processing, data engineering, data analysis, data visualization, and web development using Spark, Scala, Python, and other languages.
- Develop and manage data pipelines to ingest, cleanse, and transform data from diverse sources, utilizing tools like Apache NiFi and AWS Glue.
- Implement and optimize data processing workflows using technologies such as Apache Spark, Hadoop, and cloud-based solutions like AWS EMR.
- Design and maintain data warehouses, including cloud data warehousing platforms like Amazon Redshift or Azure Synapse Analytics.
- Create and manage data models and schemas for structured data, ensuring efficient query performance using tools like ERWin or Apache Avro.
- Build ETL processes for data extraction, transformation, and loading using AWS Glue and Azure Data Factory.
- Manage Hadoop clusters or leverage cloud infrastructure (AWS, Azure) for scalable and cost-effective data processing and storage.
- Maintain version control using tools like Git and document data engineering processes, workflows, and infrastructure configurations.
- Enforce data quality checks and governance policies to maintain data accuracy, consistency, and compliance with industry standards.
- Develop real-time data processing solutions using Apache Kafka or cloud-based streaming services to enable immediate data insights.
- Follow industry best practices, such as regular code reviews, and work with a variety of development tools.
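The pipeline work described above (ingest, cleanse, transform, load) can be illustrated with a minimal sketch. This uses plain Python rather than Spark or AWS Glue, and the record fields ("id", "name", "amount") are hypothetical examples chosen purely to show the ETL stages involved:

```python
# Minimal ETL sketch: extract -> transform (cleanse) -> load.
# Field names and sample data are illustrative assumptions, not a real schema.

def extract():
    """Simulate ingesting raw records from a source system."""
    return [
        {"id": 1, "name": "  Alice ", "amount": "10.5"},
        {"id": None, "name": "Bob", "amount": "3.0"},   # missing key: dropped
        {"id": 2, "name": "Carol", "amount": "7.25"},
    ]

def transform(records):
    """Cleanse: drop records without an id, trim names, cast amounts."""
    cleaned = []
    for rec in records:
        if rec["id"] is None:
            continue
        cleaned.append({
            "id": rec["id"],
            "name": rec["name"].strip(),
            "amount": float(rec["amount"]),
        })
    return cleaned

def load(records):
    """Load into a keyed store (here, an in-memory dict standing in
    for a warehouse table)."""
    return {rec["id"]: rec for rec in records}

warehouse = load(transform(extract()))
```

In production the same three stages would typically be expressed as Spark transformations or Glue jobs, with the in-memory dict replaced by a warehouse such as Redshift or Synapse.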
Knowledge, Skills & Experience:
- Proficient in implementing data processes and tools such as Spark, Scala, and Python optimizations, Apache NiFi, AWS Glue, Amazon Redshift or Azure Synapse Analytics, Azure Data Factory, ERWin or Apache Avro, Hadoop, and Kafka.
- Experience with application development methodologies, including Agile and Scrum.
- Experience with or knowledge of data encryption, access controls, and data masking to ensure data security and regulatory compliance (e.g., GDPR, HIPAA).
- Proven experience in a similar role with a focus on security systems administration.
Education:
Bachelor's Degree in Computer Science or closely related field.
Please reach out to our recruitment team at hr@itspurs.com for the latest openings.