About Us:
Paytm is India’s leading digital payments and financial services company, focused on driving consumers and merchants to its platform by offering them a variety of payment use cases. Paytm provides consumers with services like utility payments and money transfers, while empowering them to pay via Paytm Payment Instruments (PPI) like Paytm Wallet, Paytm UPI, Paytm Payments Bank Netbanking, Paytm FASTag and Paytm Postpaid - Buy Now, Pay Later. To merchants, Paytm offers acquiring devices like Soundbox, EDC, QR and Payment Gateway, where payment aggregation is done through PPI as well as other banks’ financial instruments. To further enhance merchants’ business, Paytm offers them commerce services through advertising and the Paytm Mini app store. Building on this platform leverage, the company also offers credit services such as merchant loans, personal loans and BNPL, sourced by its financial partners.
Job Summary:
We are seeking a skilled Data Engineer with 2 to 5 years of experience in software development and data engineering to join our dynamic team. The ideal candidate will have a strong background in cloud technologies, microservices, containerization, and CI/CD pipelines. Experience with Agile methodologies, strong communication skills, and the ability to collaborate effectively within a team are also essential for this role.
Key Responsibilities:
- Design, develop, and maintain scalable data pipelines to support data ingestion, processing, and storage
- Implement data integration solutions to consolidate data from multiple sources into a centralized data warehouse or data lake
- Collaborate with data scientists and analysts to understand data requirements and translate them into technical specifications
- Ensure data quality and integrity by implementing robust data validation and cleansing processes
- Optimize data pipelines for performance, scalability, and reliability
- Develop and maintain ETL (Extract, Transform, Load) processes using tools such as Apache Spark, Apache NiFi, or similar technologies
- Monitor and troubleshoot data pipeline issues, ensuring timely resolution and minimal downtime
- Implement best practices for data management, security, and compliance
- Document data engineering processes, workflows, and technical specifications
- Stay up to date with industry trends and emerging technologies in data engineering and big data
Mandatory Skills and Qualifications:
- Bachelor’s degree in Computer Science, Engineering, or a related field
- 2 to 5 years of experience in data engineering or a similar role
- Cloud platforms: proficiency with AWS (EMR, EC2, S3, DynamoDB, Lambda, EBS, RDS, CloudWatch, CloudTrail)
- Spark: advanced knowledge of Apache Spark for large-scale data processing (a brief illustrative sketch follows this list)
- Programming languages: proficiency in Python, Scala, and SQL
- Miscellaneous: Airflow, Kafka streaming, REST APIs, Bitbucket repositories, CI/CD pipelines, MongoDB, sharded databases, master-slave architectures, Parquet files
- ETL tools: experience with ETL tools and frameworks
- Soft skills: excellent problem-solving skills, attention to detail, and the ability to work collaboratively in a team environment
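To give candidates a flavour of the Spark-based ETL work described above, here is a minimal, illustrative PySpark batch sketch: read raw Parquet data from S3, apply a simple validation and cleansing step, and write partitioned Parquet back to a data lake. The bucket paths, column names, and partition key are hypothetical placeholders, and S3 connector configuration and credentials are assumed to already be in place; this is not a description of any specific Paytm pipeline.

    # Minimal illustrative PySpark batch ETL sketch (hypothetical paths/columns)
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("example-etl").getOrCreate()

    # Ingest: raw transaction events landed as Parquet (hypothetical bucket/path)
    raw = spark.read.parquet("s3a://example-raw-bucket/transactions/")

    # Cleanse/validate: drop rows missing the primary key, normalise amounts,
    # and de-duplicate on the transaction id
    clean = (
        raw.filter(F.col("txn_id").isNotNull())
           .withColumn("amount", F.col("amount").cast("double"))
           .dropDuplicates(["txn_id"])
    )

    # Load: write partitioned Parquet into the curated zone of the data lake
    (
        clean.write.mode("overwrite")
             .partitionBy("txn_date")
             .parquet("s3a://example-curated-bucket/transactions/")
    )

    spark.stop()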
Compensation:
If you are the right fit, we believe in creating wealth for you. With an enviable 500 mn+ registered users, 25 mn+ merchants, and the depth of data in our ecosystem, we are in a unique position to democratize credit for deserving consumers and merchants, and we are committed to it.
India’s largest digital lending story is brewing here.
It’s your opportunity to be a part of the story!