Raghav Khanna

+91 9548826365 | itsraghavkhanna@gmail.com | LinkedIn | Website | GitHub


Professional Summary


Highly skilled software developer with over 3 years of experience building scalable Java and Spring-based applications. Demonstrated ability to deliver high-quality solutions, complemented by strong communication skills honed through client-facing roles, project work, and leading technical events.
Proficient in Java backend development and cloud engineering, with additional frontend experience. Certified AWS Solutions Architect with in-depth knowledge of AWS cloud architecture, microservice architecture and its related stack (service registry, Splunk, microservices, VMs), and container technologies (Docker, Kubernetes).
Committed to self-improvement, I consistently seek opportunities to enhance my skills and knowledge, aspiring to excel as a technical leader. Over more than 3 years at Deloitte, I have gained hands-on exposure to a diverse range of technologies, including backend and API development, Kafka, AWS, Docker, and cloud computing.
Experienced in Agile methodologies and DevOps CI/CD environments and tools such as Git, GitLab, Bitbucket, Maven, Jenkins, Bamboo, Rancher, Docker, Kubernetes, Terraform, Splunk, and Lenses.
A passionate learner and adept problem solver, I am enthusiastic about contributing to projects that generate positive impact.


Technical Skills



Experience


Deloitte | August 2021 – Present

Project: Lululemon Athletica Inc. - Canada

Role: Backend Developer

Industry: Retail Sector

Tools and Technology: Java, Spring Boot, Spring Security, Envelope Encryption, Spring Scheduler, Kafka, Apigee, Splunk, Datadog, Terraform, GitLab, Vault, Maven, Jenkins, Docker

Responsibilities:

  • Developed the Order Purchase History and Details API for users and Educators, enabling them to track order status and view details of purchased orders. The application connects directly to DynamoDB to fetch user details.
  • Enhanced the existing Create Order Consumer API and Order Updates API to receive order transaction updates from commercetools via Kafka consumers and update order details in DynamoDB.
  • Developed a Lambda application that captures all DynamoDB transaction details (a live transaction feed) via triggers and publishes them to Kafka topics with PII data encrypted using envelope encryption, enabling a downstream consumer to generate reports based on user order history (EBR feed).
  • Developed a Lambda application, triggered whenever a new transaction is recorded in DynamoDB, leveraging the DynamoDB Streams API; built a companion service that compares the old and new images from the streams, extracts the updated data, and publishes it to Kafka topics with PII data encrypted using envelope encryption.
  • Created Splunk alerts and delivered multiple dashboards for each application to track exceptions, errors, RPM, live user count, and other analyses.
  • Delivered a RED dashboard that visualizes RPM, performance, TAT, SLIs, and SLAs for each application, including the performance of consumer and producer apps.
  • Trained 3 analysts in Java, AWS, and Splunk, facilitating their successful onboarding onto a client project and contributing to increased business.
  • Organized Splunk training at the organizational level in coordination with the learning and development team.

Project: Public Pension Agency (Government of Saudi Arabia) - Saudi Arabia

Role: Full-Stack Developer

Industry: Finance/Public Sector

Tools and Technology: Java, Spring Boot, React, JavaScript, Bootstrap, HTML, CSS

Responsibilities:

  • Created a business rule engine (BRE) providing rules/queries to automate the business flow.
  • Developed backend service using Spring Boot, interacting with Oracle DB via JPA repository.
  • Built the frontend using Bootstrap and integrated the application with the backend.
  • Worked on Informatica to create data mappings as per the provided data model.
  • Created custom Tableau queries for dashboards as required.
  • Mentored interns on Java and Spring Boot to develop a complete application.

Project: McCain Foods Ltd. (Multinational Frozen Food Company), Canada

Role: Data Transformation and Refinement

Industry: Retail and Frozen Food Sector

Tools and Technology: Data Readiness, Data Cleansing, Salesforce Data Loader, Azure Logic Apps

Responsibilities:

  • Performed data cleansing and data readiness activities, ensuring the accuracy, completeness, and consistency of the provided data.
  • Created basic Azure Logic Apps to fetch and dump data into the target system.
  • Developed relationships between data objects and real-time entities to create fields in Salesforce.
  • Uploaded data into Salesforce using the Salesforce Data Loader tool.

I-Smile Technologies | Cloud Engineer | Sept – Nov 2022 | Internship

Role: Cloud Engineering Intern

Responsibilities:

  • Deployed applications via AWS and Azure with CI/CD pipelines.
  • Provisioned infrastructure and virtual machines for compute and general requirements.
  • Built a Virtual Assistant using IBM Cloud Watson Assistant API and integrated it with a WordPress website.

Niki.ai | Operations Analytics | June – Sept 2022 | Internship

Role: Operations Analytics Intern

Responsibilities:

  • Worked closely with the Operations Team at Niki on various analytics tasks and requirements.
  • Worked with diverse tools including Excel, SQL, and DynamoDB, and managed databases hosted on AWS.

Certifications



Achievements



Education


Bachelor of Engineering and Technology in Computer Science and Engineering

Institute: Meerut Institute of Engineering and Technology (07/2017 – 08/2021) (SGPA – 8.4)


Activities and Interests