Required Skills:
- 3+ years of experience in Big Data administration.
- Hands-on experience with the Big Data stack, Apache Spark, and Azure Cloud: HDInsight, Fabric, ADF, Azure Compute, Storage, and Networking.
- Troubleshooting skills.
Dear Candidate, Greetings from TATA Consultancy Services! Thank you for expressing your interest in exploring a career possibility with the TCS family.
Hiring For: Big Data with PySpark/Spark, Scala
Location: Mumbai - Olympus
Experience: 5+ years
Required Technical Skill Set: PySpark, Hive, HBase
Must-Have: Spark, PySpark, Hive, HBase, Kafka
Good-to-Have: Agile
SN Responsibility of /...
We’re Hiring: Informatica BDM (Big Data) Developer
Joining: Immediate / February joiners preferred
📍 Location: Mumbai (Andheri)
💼 Experience: 4+ years
💰 CTC: Up to ₹16 LPA
🔎 What We’re Looking For: We’re looking for a Big Data Developer who can work with large-scale datasets, build robust data pipelines, and support risk-related data processing for business-critical systems.
🛠 Key Skills...
Role Overview: We are seeking a highly skilled Data Engineer - GCP with 6–8 years of experience in designing, developing, and managing enterprise data solutions on Google Cloud Platform (GCP). The ideal candidate will have a strong background in cloud data architecture, data warehousing, big data processing, and data integration, with proven expertise in delivering scalable, secure, and...
Job Title: Data Engineer (Data Warehouse Focus)
Domain: BFSI
Experience: 5-8 Years
Location: Remote
Employment Type: Full-Time
Job Summary: Seeking a Data Engineer with strong expertise in Snowflake, Python, Airflow, and SQL to design, build, and maintain scalable cloud-based data warehouse solutions for BFSI platforms.
Key Responsibilities:
- Design and develop data warehouse architectures...
Position Requirements:
● Minimum 4+ years of experience with PostgreSQL or any Unix-based relational database; advanced understanding of SQL.
● Minimum 2+ years of experience with Pentaho Data Integration tools or other ETL/ELT tools such as Talend, Informatica, or HOP.
● 2+ years of hands-on experience with orchestration tools (Airflow, Control-M, Autosys).
● Hands-on experience in Python scripting...
• Experience in administration of Hadoop Big Data tools
• Experience working on batch processing and tools in the Hadoop technical stack (e.g., MapReduce, YARN, Hive, HDFS, Oozie)
• The candidate must have experience in Ambari setup and management
• 1 to 2 years of MapR cluster management/administration
• 2+ years of administration experience working with tools in the stream processing technical...
Responsibilities:
- Design, develop, and implement robust Big Data solutions using technologies such as Hadoop, Spark, and NoSQL databases.
- Build and maintain scalable data pipelines for effective data ingestion, transformation, and analysis.
- Collaborate with data scientists, analysts, and cross-functional teams to understand business requirements and translate them into technical...
At Nielsen, we believe that career growth is a partnership. You ultimately own, fuel and set the journey. By joining our team of nearly 14,000 associates, you will become part of a community that will help you to succeed. We champion you because when you succeed, we do too. Embark on a new initiative, explore a fresh approach, and take license to think big, so we can all continuously improve. We...
Description: 4 or more years of experience working directly with enterprise data solutions. Hands-on experience working in a public cloud environment and with on-prem infrastructure. Specialization in columnar databases such as Redshift Spectrum and in AWS cloud infrastructure services (Redshift, S3, Lambda). Excellent SQL skills and Python coding are a must. Experience with a wide variety of modern data...
Title – Senior Data Engineer
Position – Contract with opportunity for full-time conversion
Location – Remote work
Salary – 1-2 Lakhs (monthly)
Working hours: 6:30 am – 2:30 pm EST (USA)
Qualifications:
- Bachelor’s or Master’s degree in Computer Science, Information Systems, or a related field.
- 6+ years of experience architecting and building large-scale data platforms
- Experience...
Job Summary: We are seeking a skilled Data Quality Engineer to ensure the accuracy, reliability, and integrity of our data pipelines and workflows. The ideal candidate will have hands-on experience in data engineering concepts, with a strong focus on quality testing, validation, and pipeline orchestration. Key Responsibilities: Design, develop, and execute data quality test cases to validate...
About Statusneo: We accelerate your business transformation by leveraging best-fit CLOUD NATIVE technologies wherever feasible. We are DIGITAL consultants who partner with you to solve and deliver. We are experts in CLOUD NATIVE TECHNOLOGY CONSULTING & SOLUTIONS. We build, maintain, and monitor highly scalable, modular applications that leverage the elastic compute, storage, and network of leading cloud...
📍 Location: Dubai, UAE (Hybrid) Remote considered for exceptional profiles ⏳ Duration: 4 Months (Internship) → Full-Time Conversion Opportunity About Sekuen & SignalX Sekuen builds AI-driven products that help companies scale revenue intelligently. Our product SignalX targets hard-to-find B2B prospects using 300M+ business profiles, enabling modern, automated, and agentic sales...
Title: Azure Data Engineer (Databricks) Location: 100% REMOTE (from India) Shift Timings: 4:00pm IST to 1:00am IST Pay: Negotiable (depending on experience) Start Date: ASAP Job Description: We are seeking a highly skilled Azure Data Engineer with deep expertise in the Azure cloud ecosystem to design, build, and maintain scalable data integration and analytics solutions. The ideal candidate...
Job Title: Senior / Lead Data Engineer – Healthcare Analytics (GCP) Experience: 8–10+ Years Location: Remote Time: 5:30 PM IST to 2:30 AM IST Domain: Healthcare Data & Analytics Primary Cloud: Google Cloud Platform (GCP) Job Summary We are seeking a highly experienced Senior / Lead Data Engineer with deep hands-on expertise in building scalable data platforms on Google Cloud Platform...
• Cloud Data Analytics (AWS, Oracle, Azure, Snowflake) – hands-on solution design experience on at least two of these cloud platforms.
• Experience in growing and mining existing accounts.
• Experience implementing at least 4 Cloud Data Analytics & Big Data engagements.
• Extensive experience in solution design for complex projects in big data, cloud analytics, BI/DW, and data management.
• Extensive...
• The person hired will provide administration and platform support to our development teams as new cloud applications are deployed, ensuring that code and access to data are optimized to minimize our monthly AWS chargeback costs.
• This position requires a high level of knowledge and extensive hands-on experience designing, structuring, optimizing, implementing, and...
Responsibilities:
- Work in a challenging, fast-paced environment to create meaningful impact through your work
- Identify business problems and use data analysis to find answers
- Code and maintain the data platform and reporting analytics
- Design, build, and maintain data pipelines
Qualifications:
- Desire to collaborate with a smart, supportive engineering team
- Strong passion for data and willingness to learn...
The candidate should be able to:
- Work closely with our QA team to ensure data integrity and overall system quality
- Work closely with Technology Leadership, Product Managers, and the Reporting Team to understand the functional and system requirements
- Write Shell/Python scripts for job scheduling and data wrangling
- Write Sqoop jobs to import/export data to and from Hadoop
- Enhance existing Spark and Java...