9738882432 | [email protected] | aditya-anand
- Redesigned the entire reporting module, reducing latency of critical APIs from ~10s to <1s by routing reads to an RDS replica DB, decreasing load on the primary DB, cutting cost by 50%, and making the product more resilient.
- Redesigned the settlement flow for the finance team using AWS SQS, Lambda & SFTP, reducing their operation time from ~2 hours to <5 min.
- Implemented Redis caching via AWS ElastiCache at the middleware layer (see the cache-aside sketch after this list), reducing data-fetch latency from milliseconds to microseconds, resulting in faster API response times & reduced DB load.
- Moved middleware operations from application code to the Nginx layer using Lua, resulting in faster API performance and single responsibility (SRP) enforced at the infrastructure level.
- Engineered APIs in collaboration with banks, enabling merchants & billers to analyze payment-failure incidents for UPI transactions, resulting in higher payment success rates.
- Optimized database performance by tuning indexes, stored procedures and DB queries, reducing latency and DB CPU usage.
- Engineered a performance-benchmarking framework in Java, used by multiple development teams for their API benchmarking.
- Developed a payment product on top of the UPI stack, enabling users to pay via multiple payment methods such as UPI, debit card, credit card, UPI credit card and app-specific reward points.
- Developed simulated NPCI services mimicking the BBPS ecosystem, enabling the team to complete product certification and go live with minimal defects in under 6 months.
- Developed a mock WhatsApp service, used by test teams to test the lending team's WhatsApp bot.
- Developed a Python tool for test-case migration from HP QC to Octane.
- Created a Python tool for API analytics using HAR files (see the sketch after this list).
- Identified framework-level defects in system behavior, improving the software's conformance to business standards.
- Contributed to the BDD process by writing feature files and step definitions in Java.
- Developed a script to report multiple spam/phishing URLs at once and notify the user if a URL has already been reported.
- Created a script to generate an automated weekly report of statistics on the actions performed.
- Used SIEM tools to analyze bulk data to identify spam/phishing campaigns and act on them; created automated reports that flag potentially malicious users/domains used to spam users.
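A minimal cache-aside sketch in Python illustrating the kind of middleware-layer Redis caching described above; the key scheme, TTL and fetch_from_db helper are assumptions for illustration, not the production implementation.

```python
import json
import redis  # redis-py client; ElastiCache exposes the standard Redis protocol

# Hypothetical endpoint; in production this would point at the ElastiCache cluster.
cache = redis.Redis(host="localhost", port=6379, decode_responses=True)

CACHE_TTL_SECONDS = 300  # assumed TTL; tune per data-freshness requirements


def fetch_from_db(merchant_id: str) -> dict:
    """Placeholder for the real DB query (e.g. against an RDS read replica)."""
    return {"merchant_id": merchant_id, "settlement_status": "PENDING"}


def get_merchant_report(merchant_id: str) -> dict:
    """Cache-aside read: try Redis first, fall back to the DB and populate the cache."""
    key = f"report:{merchant_id}"  # assumed key scheme
    cached = cache.get(key)
    if cached is not None:
        return json.loads(cached)

    data = fetch_from_db(merchant_id)
    cache.set(key, json.dumps(data), ex=CACHE_TTL_SECONDS)
    return data
```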
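And a minimal sketch of HAR-based API analytics in Python, assuming the standard HAR 1.2 JSON layout; the per-URL aggregation shown is illustrative rather than the original tool.

```python
import json
from collections import defaultdict


def summarize_har(path: str) -> dict:
    """Aggregate per-URL call counts and average response times from a HAR file."""
    with open(path, encoding="utf-8") as f:
        har = json.load(f)

    stats = defaultdict(lambda: {"calls": 0, "total_ms": 0.0})
    for entry in har["log"]["entries"]:              # standard HAR 1.2 structure
        url = entry["request"]["url"].split("?")[0]  # drop query parameters
        stats[url]["calls"] += 1
        stats[url]["total_ms"] += entry["time"]      # total entry time in ms

    return {
        url: {"calls": s["calls"], "avg_ms": s["total_ms"] / s["calls"]}
        for url, s in stats.items()
    }


if __name__ == "__main__":
    for url, s in summarize_har("capture.har").items():
        print(f"{s['calls']:4d} calls  {s['avg_ms']:8.1f} ms avg  {url}")
```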