Job Description
📍 Location: Pune, India (Hybrid)
💼 Job Type: Full-Time
🆔 Job Req ID: 25892286
🌟 About Citi
At Citi, we are more than a global financial services company; we are a community of over 230,000 professionals working across the world to deliver innovative solutions, give back to communities, and make a real impact. Citi offers a dynamic environment to grow your career and contribute to projects that shape the future of banking.
🖥️ Job Overview
The Senior Developer – Java / Spark / PySpark role is part of the Digital Software Engineering team. You will be responsible for developing and maintaining scalable, high-performance applications within the Big Data ecosystem. The role involves working on Spark, Hadoop, and related technologies, and requires strong programming expertise in Java and/or Python.
You will independently solve complex problems, integrate in-depth knowledge with industry standards, and provide analytical insights to support business-critical decisions.
🔑 Responsibilities
- Design, develop, and implement robust Big Data solutions using Spark and PySpark.
- Write clean, secure, testable, and maintainable code for distributed, cloud-based applications.
- Work extensively with Hadoop, Hive, SQL, and Unix shell scripting.
- Participate in CI/CD processes using Jenkins, Git, RLM, and other tools.
- Collaborate with global teams to understand system architecture, dependencies, and runtime operations.
- Develop and maintain integrated project schedules accounting for dependencies and risks.
- Optimize application performance, automate code quality checks (SonarQube, FindBugs), and implement unit testing.
- Provide technical guidance on system build, performance, and design discussions.
- Ensure compliance with firm policies, regulations, and ethical standards while assessing technical risks.
✅ Required Skills & Qualifications
- Experience: 4–7 years in application development, including 2–3 years as a developer in the Big Data ecosystem.
- Programming: Expert in Java and/or Python.
- Big Data Technologies: Spark, PySpark, Hadoop, Hive.
- Database & SQL: Expert-level knowledge.
- Other Skills: Unix shell scripting (intermediate), CI/CD tools (Jenkins, Git, RLM, Lightspeed), Jupyter Notebook (good to have).
- Strong analytical, quantitative, and problem-solving skills.
- Ability to work independently and collaboratively with global teams.
- Experience delivering in Agile methodology.
🎓 Education
- Bachelor’s degree or equivalent experience in Computer Science, Engineering, or a related field.
💡 Preferred Qualifications
- Banking domain knowledge is a plus.
- Exposure to running high-traffic distributed cloud services.
- Experience leading infrastructure programs and working with third-party service providers.