ANTARA HUDDAR

Data Engineer
Nagpur

Summary

ETL and Snowflake/Databricks developer with more than 5 years of experience in the Retail and Power domains. Proficient in SQL, Snowflake, ETL tools, Databricks, shell scripting, and AWS services, delivering analytical and problem-solving solutions for data mining and data integration. Experienced across the full project lifecycle with IBM's DataStage ETL tool: analyzing requirements, developing proof-of-concepts, designing, developing, and unit testing ETL pipelines. Skilled in optimizing SQL scripts and building aggregated-table solutions, with hands-on experience in AWS services such as S3, EC2, Kinesis, and Secrets Manager alongside the Snowflake cloud data warehouse.

Overview

6 years of professional experience
4 years of post-secondary education
3 certifications

Work History

Data Engineer

InfoCepts
Nagpur
07.2018 - Current

Summary:

  • Migrated legacy systems to cloud technologies, improving performance and scalability while minimizing business disruption
  • Collaborated with cross-functional teams for seamless integration of data sources into the company's data ecosystem
  • Managed cloud-based infrastructure to ensure optimal performance, security, and cost-efficiency of the company's data platform
  • Optimized data processing by implementing efficient ETL pipelines and streamlining database design
  • Collaborated on ETL (Extract, Transform, Load) tasks, maintaining data integrity and verifying pipeline stability
  • Prepared written summaries to accompany results and maintain documentation
  • Contributed to internal activities for overall process improvements, efficiencies and innovation
  • Provided technical guidance and mentorship to junior team members, fostering a collaborative learning environment within the organization

Education

Bachelor of Engineering - Information Technology

St. Vincent Pallotti College of Engineering & Technology
India
07.2014 - 05.2018

Skills

    Snowflake


Project Profiles

1) Luxury Retail Client

  • Conducted analysis for a cloud migration project focusing on database selection, leveraging practical experience in AWS architecture and technologies such as EC2, S3, and Secret Manager.
  • Assisted in the conversion of 2500 SQL scripts into Snowflake database format using a Python script, ensuring seamless data migration.
  • Facilitated 20 code catch-up sessions during ongoing deployments to ensure smooth transition and minimize disruptions.
  • Analyzed report specifications and implementation complexities to provide insights and recommendations for efficient execution.
  • Conducted rigorous unit testing of DataStage jobs on new EC2 machines to validate functionality and performance.
  • Contributed to the development of Unix scripts to facilitate file transfer between environments and AWS, optimizing data transfer processes.
  • Demonstrated proficiency in ingesting real-time data for immediate reporting needs, enhancing decision-making processes with up-to-date insights.
  • Engineered ETL (Extract, Transform, Load) pipelines tailored to efficiently ingest semi-structured data formats such as JSON and XML into data warehousing systems, aligning with specific data warehousing requirements and standards.
  • Engaged in driving calls with Business Analysts for requirement gathering, ensuring alignment between technical solutions and business needs.
  • Led calls with Business stakeholders to accelerate the User Acceptance Testing (UAT) process, actively working to resolve issues and bring closure to the UAT phase.
  • Provided technical guidance and support to the team, leveraging expertise in cloud migration, AWS technologies, and scripting languages.
  • Implemented Continuous Integration and Continuous Deployment (CI/CD) pipeline in the client environment, ensuring efficient and automated deployment processes.
  • Utilized Bitbucket as a versioning tool, managing code repositories and enabling collaboration among team members.
  • Configured Jenkins pipelines to facilitate the deployment of code artifacts to various client environments, enhancing efficiency and reducing manual effort.
  • Facilitated ETL code synchronization from the client repository to the versioning tool, ensuring consistency and version control across environments.
  • Improved Jenkins pipeline code to automate the deployment of objects into client environments, streamlining the deployment process and minimizing errors.
  • Developed Standard Operating Procedure (SOP) documents to provide clear guidance for new team members, facilitating their onboarding and ensuring adherence to best practices.
  • Conducted training sessions for developers and clients to impart understanding of the CI/CD process, promoting collaboration and knowledge sharing within the team and with stakeholders.
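The 2500-script conversion above was driven by a Python script; a minimal sketch of that kind of batch rewriter is below. The dialect rules shown (e.g. SYSDATE → CURRENT_TIMESTAMP(), NVL → COALESCE) are illustrative assumptions, not the actual mapping used on the project.

```python
import re

# Illustrative dialect substitutions (assumed examples, not the real
# project mapping): each pair rewrites a legacy SQL construct into a
# Snowflake-compatible equivalent.
RULES = [
    (re.compile(r"\bSYSDATE\b", re.IGNORECASE), "CURRENT_TIMESTAMP()"),
    (re.compile(r"\bVARCHAR2\b", re.IGNORECASE), "VARCHAR"),
    (re.compile(r"\bNVL\b", re.IGNORECASE), "COALESCE"),
]

def convert_to_snowflake(sql: str) -> str:
    """Apply each rewrite rule to one legacy SQL script."""
    for pattern, replacement in RULES:
        sql = pattern.sub(replacement, sql)
    return sql

if __name__ == "__main__":
    legacy = "SELECT NVL(name, 'n/a'), SYSDATE FROM customers"
    print(convert_to_snowflake(legacy))
```

In practice such a script would loop over all input files, write converted copies, and report any constructs no rule matched, so unconverted scripts can be reviewed by hand.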


2) Power Producer Client

  • Created aggregated views tailored for cryptocurrency mining data, enabling comprehensive analysis and insights into mining operations.
  • Developed functions and stored procedures to address specific business requirements, ensuring efficient data processing and retrieval within the database environment.
  • Conducted rigorous unit testing and data validation procedures to ensure the accuracy and integrity of data throughout the development process, maintaining high-quality standards and reliability in the delivered solutions.
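The kind of rollup those aggregated views compute can be sketched in plain Python; the field names (site, kwh) are hypothetical, not the client's actual schema.

```python
from collections import defaultdict

def aggregate_by_site(readings):
    """Roll up raw mining readings into per-site totals, mirroring
    the result an aggregated database view would return."""
    totals = defaultdict(float)
    for reading in readings:
        totals[reading["site"]] += reading["kwh"]
    return dict(totals)

if __name__ == "__main__":
    raw = [
        {"site": "A", "kwh": 10.0},
        {"site": "A", "kwh": 5.5},
        {"site": "B", "kwh": 7.0},
    ]
    print(aggregate_by_site(raw))  # {'A': 15.5, 'B': 7.0}
```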


3) Pet Retail Client

  • Executed the repointing of views from Netezza to Snowflake as part of the Netezza decommission initiative, ensuring seamless transition and continuity of data access.
  • Refactored Snowflake queries originally implemented in the Denodo tool, facilitating the ultimate decommissioning of Denodo and optimizing query performance for Snowflake.
  • Assisted the team/lead in the deployment process of views from development (DEV) to production (PROD) environments, ensuring smooth and efficient transitions while maintaining data integrity and security protocols.
  • Transitioned into the Databricks environment, bringing expertise and experience to leverage its capabilities for advanced data analytics and processing.
  • Collaborated with teams to design and implement efficient data pipelines, utilizing Databricks' powerful tools and functionalities to streamline data workflows and optimize performance.
  • Actively participated in knowledge sharing and training sessions to disseminate best practices and foster proficiency among team members in utilizing Databricks effectively.
  • Contributed insights and recommendations for utilizing Databricks to address specific business challenges and opportunities, demonstrating a deep understanding of both technical and business requirements.
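Repointing a view during a decommission like the Netezza one above amounts to rewriting qualified object references in its DDL; a minimal sketch follows, with entirely hypothetical object names (NZ_PROD.SALES, ANALYTICS.PUBLIC.SALES).

```python
def repoint_view(ddl: str, mapping: dict) -> str:
    """Rewrite qualified table references in a view DDL so the view
    reads from Snowflake objects instead of decommissioned Netezza
    ones. The old-to-new name mapping is supplied per environment."""
    for old_ref, new_ref in mapping.items():
        ddl = ddl.replace(old_ref, new_ref)
    return ddl

if __name__ == "__main__":
    # NZ_PROD.SALES and ANALYTICS.PUBLIC.SALES are hypothetical names.
    ddl = "CREATE OR REPLACE VIEW v_sales AS SELECT * FROM NZ_PROD.SALES"
    print(repoint_view(ddl, {"NZ_PROD.SALES": "ANALYTICS.PUBLIC.SALES"}))
```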


4) Company Initiative 

  • Successfully tackled a designated problem statement by architecting a solution leveraging various AWS services.
  • Engineered a custom codebase capable of ingesting real-time data streams via Amazon Kinesis, seamlessly interfacing with an ETL tool for comprehensive profiling.
  • Utilized Amazon Redshift as the primary database solution for storing the streaming data in its raw format, ensuring scalability and performance.
  • Configured CloudWatch Logs to meticulously capture logs generated by all services employed within the project ecosystem, facilitating comprehensive monitoring and troubleshooting.
  • Leveraged Amazon QuickSight to design and implement a dynamic dashboard tailored to the specific requirements dictated by real-time data insights, empowering stakeholders with actionable visualizations.

Certification

Databricks Certified Data Engineer Associate, 03-2024
dbt Fundamentals Badge, 06-2023
Hands-On Essentials: Data Warehousing Workshop Badge, 11-2021