Rocky Hill, CT, US
Phone: xxx-xxx-xxxx
Email: xxx@xxxx.xxx
Looking For: Data Architect, DevOps & Cloud Engineer
Occupation: IT and Math
Degree: Bachelor's Degree
Career Level: Fully Competent
Languages: English
Highlights:
• The first phase of systems development is in progress: migrating the legacy billing system to Guidewire Billing Center, with significant milestones already achieved in processing speed and data accuracy.
• Implemented CDA Connector-based Glue ETL for seamless multi-system file integration, enhancing data retrieval and accuracy.
• Developed a custom dbt orchestration solution on EKS, leveraging GitOps with CBJ Docker builds and Argo CD for efficient workload placement.
• Enhanced observability for micro-batch ELT jobs using Elasticsearch/CloudWatch monitoring.
• Designed and deployed a highly available Kubernetes cluster with a GitOps workflow, reducing deployment times by 80% and achieving 99.99% uptime.
• Implemented a scalable, dockerized Elasticsearch cluster for log analytics, improving system-wide monitoring capabilities and reducing mean time to resolution (MTTR) for incidents by 60%.
• Architected a comprehensive data governance framework using Alation Data Catalog and AWS Glue Catalog, ensuring regulatory compliance and improving data discovery.
• Developed a Talend Cloud EC2-based data pipeline, significantly enhancing data processing and accuracy.
• Streamlined ADP Talend ETL Jenkins pipelines using Groovy scripts, improving data quality and reducing manual workload.
• Designed self-service portal security with Okta, AWS IAM, and HashiCorp Vault, enhancing data protection and efficiency.
• Implemented cost-effective infrastructure management strategies, lowering TCO for internal cloud systems.
• Successfully deployed the Carpe Data Snowflake warehouse system, enhancing data management capabilities and achieving cost savings through data reuse.
Skills:
Cloud Platforms: AWS, Azure, GCP
Virtualization: Proxmox, VMware
Containerization: Docker, Kubernetes, LXC/LXD
AI/ML: Ollama, OpenAI, PyTorch, TensorFlow, RAG
Big Data: Apache Spark, Databricks, Hadoop
Data Processing: dbt, PySpark, Snowflake
Databases: DynamoDB, MySQL, MongoDB, PostgreSQL
DevOps: Jenkins, GitLab, Argo CD, Helm, Terraform
ETL Tools: AWS Glue, Informatica, Talend
Networking: VLANs, OPNsense, Tailscale, WireGuard
Security: IAM, SIEM, Zero Trust Architecture
Monitoring: ELK Stack, Grafana, Prometheus
Project Management: Agile methodologies, Jira, Rally
Self-hosted: Cloudflare DNS, Kibana, Logstash, Nextcloud
Programming Languages: C/C++, Rust, Golang, Scala, Java
Frameworks: Angular, AWS SDK (Boto3), Azure SDK
Big Data & ML: Apache Spark (PySpark/Scala Spark)
Python Tools: NumPy, Pandas, PyTorch, FastAPI, OpenAI
Web Development: React.js, Angular
Goal: Data & Applications Architect
Innovative Data & Applications Architect with 17+ years of experience in Data Warehousing, Business Intelligence, and Cloud Technologies. Proven ability in AWS, Azure, and GCP, with a track record of successfully implementing end-to-end solutions for Fortune 500 companies. Specialized in Big Data processing using Apache Spark, Databricks, and EMR, consistently delivering projects 20% under budget and 15% ahead of schedule. Led a team that increased data processing efficiency by 40% for a major financial institution using custom ETL solutions. Skilled in defining robust monitoring strategies and architecting resilient systems, resulting in 99.99% uptime for critical applications. Recognized thought leader in emerging technologies, with multiple publications on AI-driven data architectures and cloud-native solutions.
Certifications:
Databricks • 01/2024 - 01/2024 • Databricks
AWS Solution Architecture • 05/2021 - 12/2022 • Pluralsight
Talend Cloud • 04/2021 - 04/2021 • Talend Cloud Sales Support
Cloud Architect 09/2020 - 08/2021
LTIMindtree, Hartford, CT, United States
Industry: Software
TrvCloudEx Automation Self-Service Portal
Client: Travelers
Scope: Multi-team collaboration over 12 months, involving a $1M budget and integration with
multiple enterprise security systems.
Role: Cloud Architect
Key Achievements:
Developed a platform for efficient cloud resource provisioning and deprovisioning.
Integrated with enterprise security stacks for improved risk management.
Implemented IaC framework for state management in S3 and DynamoDB.
Measurable Results:
Significant reduction in setup times for various cloud instances.
Enhanced security and reduced risk through comprehensive integration.
Improved operational efficiency and lowered TCO for internal cloud systems.--
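For context, a minimal sketch of the provisioning/deprovisioning pattern behind the portal described above, assuming boto3 and placeholder AMI and tag values; the production portal used Terraform-based IaC with S3/DynamoDB state, so this Python example is illustrative only.

```python
# Minimal sketch of self-service provisioning/deprovisioning with boto3.
# The AMI ID, instance type, and tag values are hypothetical placeholders.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

def provision_instance(requester: str, purpose: str) -> str:
    """Launch a tagged EC2 instance on behalf of a portal user."""
    resp = ec2.run_instances(
        ImageId="ami-0123456789abcdef0",   # placeholder AMI
        InstanceType="t3.micro",
        MinCount=1,
        MaxCount=1,
        TagSpecifications=[{
            "ResourceType": "instance",
            "Tags": [
                {"Key": "Requester", "Value": requester},
                {"Key": "Purpose", "Value": purpose},
                {"Key": "ManagedBy", "Value": "self-service-portal"},
            ],
        }],
    )
    return resp["Instances"][0]["InstanceId"]

def deprovision_instance(instance_id: str) -> None:
    """Terminate an instance previously created through the portal."""
    ec2.terminate_instances(InstanceIds=[instance_id])
```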
Cloud Architect 08/2021 - 01/2023
LTIMindtree, Hartford, CT, United States
Industry: Software
DataOps- Pipeline as Service Automation Suite
Client: Travelers
Scope: Development of a self-service portal for infrastructure automation, serving various
data-related personas within Travelers Insurance.
Role: Cloud Architect
Key Achievements:
Developed a platform for provisioning and deprovisioning various cloud instances.
Integrated with enterprise security stacks (Okta AD, Azure AD, AWS IAM).
Implemented IaC framework using Terraform for state management.
Handled TDM and data security with enforced patterned data encryption masking.
Feedback Received:
Positive stakeholder feedback on improved cloud resource provisioning efficiency.
Commendations for enhanced security posture and reduced operational risks.
Measurable Results:
30% improvement in operational efficiency.
40% reduction in infrastructure setup times.
20% lower TCO for internal cloud systems.--
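As a minimal illustration of the patterned data encryption masking mentioned above (the keyed-HMAC approach, key handling, and field formats here are assumptions, not the enterprise masking rules):

```python
# Minimal sketch of deterministic, pattern-preserving masking for test data.
# The HMAC key and sample fields are hypothetical; a real TDM flow would pull
# the key from a managed secret store (e.g., Vault) and apply approved rules.
import hmac
import hashlib

SECRET_KEY = b"replace-with-vault-managed-key"

def _digest(value: str) -> str:
    return hmac.new(SECRET_KEY, value.encode(), hashlib.sha256).hexdigest()

def mask_preserving_pattern(value: str) -> str:
    """Replace letters/digits deterministically while keeping the layout."""
    digest = _digest(value)
    out, i = [], 0
    for ch in value:
        if ch.isdigit():
            out.append(str(int(digest[i % len(digest)], 16) % 10))
            i += 1
        elif ch.isalpha():
            sub = chr(ord("a") + int(digest[i % len(digest)], 16) % 26)
            out.append(sub.upper() if ch.isupper() else sub)
            i += 1
        else:
            out.append(ch)  # keep separators such as '-' or '@'
    return "".join(out)

if __name__ == "__main__":
    print(mask_preserving_pattern("123-45-6789"))          # same digit layout
    print(mask_preserving_pattern("Jane.Doe@example.com")) # same shape, masked
```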
Data Architect 03/2020 - 09/2020
LTIMindtree, Hartford, CT, United States
Industry: Software
Snowflake data warehouse system build for Carpe third-party vendor data for business systems integration
Client: Travelers
Scope: Development of a warehouse system for monthly batch data integration from the Carpe
vendor, supporting underwriter and quote system business needs.
Role: Data Architect
Key Contributions:
Designed and implemented Carpe Data Snowflake warehouse system.
Developed security architecture using Azure AD & AWS IAM.
Created stored procedures for database schema, table creation, and data processing.
Built DataOps automation pipelines for monthly batch data processing.
Developed a Python-based test automation suite.
Measurable Results:
Successfully implemented production-ready cloud-based warehouse system.
Enabled efficient data access for analysts and data scientists.
Ensured system met SLAs and scalability requirements through performance testing.--
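A minimal sketch of the kind of monthly batch DataOps step described above, assuming the snowflake-connector-python client; the warehouse, database, schema, and stored-procedure names are hypothetical placeholders, not the actual Carpe warehouse objects.

```python
# Minimal sketch of a monthly batch load step against Snowflake.
# Connection parameters, schema, and procedure names are hypothetical.
import os
import snowflake.connector

conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    warehouse="ANALYTICS_WH",
    database="CARPE_DB",
)

def run_monthly_batch(batch_month: str) -> None:
    """Create the target schema if needed, then invoke the load procedure."""
    with conn.cursor() as cur:
        cur.execute("CREATE SCHEMA IF NOT EXISTS VENDOR_RAW")
        # Hypothetical stored procedure that copies the month's vendor files
        # from an external stage into the raw tables.
        cur.execute("CALL VENDOR_RAW.LOAD_CARPE_BATCH(%s)", (batch_month,))

if __name__ == "__main__":
    run_monthly_batch("2020-06")
```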
Data Architect 02/2023 - 08/2024
LTIMindtree, Hartford, CT, United States
Industry: Software
CorpTech-Payment & Billing Systems- Downstream Billing Data Warehouse
Client: Travelers
Scope: Migration of 40-year-old ABS mainframe billing systems to Guidewire Billing Center
SaaS cloud solution, involving data conversion and historical data migration.
Role: Data Architect
Key Contributions:
Designed comprehensive ETL/ELT data pipelines for Billing Systems data.
Implemented Glue ETL, Guidewire CDA Connector, and PySpark for efficient data processing.
Developed dbt ELT processes for the Snowflake Base Layer and Semantic Layer.
Orchestrated EKS workloads with GitOps, Docker, and Argo CD.
Implemented Elasticsearch/CloudWatch for enhanced observability and monitoring.
Measurable Results:
Improved data accessibility through business table flattening.
Validated system SLAs and scalability for state and business table data loads.
Streamlined knowledge transfer processes, reducing project timelines.--
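To illustrate the business-table flattening step mentioned above, a minimal PySpark sketch that explodes a nested billing payload into a flat table; the S3 paths, column names, and nesting are assumptions rather than the actual Guidewire CDA schema.

```python
# Minimal sketch of flattening nested billing records into a business table.
# Paths and column names are hypothetical, not the actual CDA schema.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, explode

spark = SparkSession.builder.appName("billing-flatten-sketch").getOrCreate()

# Nested account records, e.g. one row per account with an array of invoices.
accounts = spark.read.parquet("s3://example-bucket/cda/billing_accounts/")

flat = (
    accounts
    .withColumn("invoice", explode(col("invoices")))   # one row per invoice
    .select(
        col("account_id"),
        col("invoice.invoice_id").alias("invoice_id"),
        col("invoice.amount_due").alias("amount_due"),
        col("invoice.due_date").alias("due_date"),
    )
)

# Write the flattened business table for downstream dbt/Snowflake layers.
flat.write.mode("overwrite").parquet("s3://example-bucket/base/billing_invoices/")
```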
JNTU Hyderabad 04/2000 - 05/2004
Hyderabad, Andhra Pradesh, India
Degree: Bachelor's Degree
Major: Electrical and Electronics Engineering
Bachelor of Technology in Electrical and Electronics Engineering, Jawaharlal Nehru Technological University, Hyderabad (04/2000 - 05/2004). Renowned for academic excellence, the program offered a comprehensive curriculum in power systems, control systems, electronics, and communication. It provided state-of-the-art labs, modern teaching facilities, and emphasized practical exposure, creativity, critical thinking, and problem-solving, preparing graduates for diverse engineering careers.