Job Location | Indore (Madhya Pradesh)
Education | Not mentioned
Salary | Not mentioned
Industry | Not mentioned
Functional Area | 1
Employment Type | Full-time
The shortlisted candidate should have strong communication, interpersonal, analytical, and problem-solving skills. They should be able to communicate complex technical concepts effectively within the team and work individually or as part of a team to achieve project goals.

Required skills:
- 4-6 years of experience in a BI environment
- AWS, Python, PySpark, AWS PaaS, Big Data

Desirable skills:
- AWS Glue

Responsibilities:
- Strong experience in AWS data and analytics services (2 years): AWS Glue, S3, Lambda, EMR, Athena, IAM, Redshift, Data Pipeline, Step Functions, Kinesis, DynamoDB, RDS, etc.
- Strong knowledge of and hands-on experience with Spark using Python (2 years)
- Strong knowledge of and hands-on experience with Python Pandas (2 years)
- Strong knowledge of and hands-on experience with the Big Data/Hadoop stack: Hive, HBase, HDFS, etc.
- Extensive experience in designing, developing, and deploying Big Data pipelines
- Strong hands-on SQL development (2 years) and an in-depth understanding of SQL optimization and tuning techniques
- Experience developing and integrating Databricks notebooks using Spark (1 year), and building and deploying pipelines on Databricks Cloud (Delta Lake)
- Development in notebooks such as Jupyter, Databricks, or Zeppelin (1 year)

Good to have:
- Developing and scheduling jobs in Airflow (1 year)
- Experience working with databases such as Redshift and Snowflake
- Good hands-on UNIX/Linux experience (2 years) with shell scripting knowledge

Education: Any Graduate
Industry: Software Services, IT-Software