hireejobs

Python with Any cloud

4.00 to 6.00 Years   Delhi   02 Mar, 2021
Job Location: Delhi
Education: Not Mentioned
Salary: Not Disclosed
Industry: IT - Software
Functional Area: General / Other Software
Employment Type: Full-time

Job Description

Roles and Responsibilities

Python with any cloud (AWS, Azure, GCP) for Bangalore / Chennai / Delhi locations, for Tredence Analytics (Tredence.com). Please find the JD below.

Mandatory skills: Python, SQL, any cloud (AWS, Azure, GCP), data modeling.
Experience: 4 to 6 years.
Notice period: candidates currently serving notice, or able to join within 15 days.
Location: Bangalore, Chennai, Delhi.

Must Have (Required Skills) / Good to Have (Preferred Skills):

  • 2 - 3 years of experience in designing and developing Python programs for data curation and processing; Experience with object-oriented programming in Python
  • Knowledge of AWS storage, compute and serverless services, particularly S3, Lambda, Kinesis, SQS and Glue
  • Expertise in at least two of these database technologies: Relational, MPP and Distributed databases hosted in Cloud and On-Premise
  • 4 - 6 years of overall IT or professional-services experience in IT delivery or large-scale IT analytics projects
  • Experience connecting and integrating with at least one of the following platforms: Google Cloud, Microsoft Azure, Amazon AWS
  • Experience with Redshift, including data modelling, data ingestion, integration, processing, and provisioning
  • Implement data pipelines to automate the ingestion, transformation, and augmentation of data sources, and provide best practices for data pipeline operations
  • Able to work in a rapidly changing business environment and adapt quickly to a fast-paced setting
  • Advanced SQL writing and experience in data exploration and using databases in a business environment with complex datasets
  • Strong verbal and written communications skills are a must, as well as the ability to work effectively across internal and external organizations
  • Experience programming in any of Java, PySpark, or Spark
  • Exposure to Apache Airflow
  • Exposure to any open source / commercial ETL tools such as Talend, Informatica, DataStage etc.
  • Familiar with data quality and standardization including reference data management
  • Experience with catalog, lineage and metadata management
  • Exposure to DevOps / CI-CD tools, services, and methodologies
  • Deploy logging and monitoring across the different integration points for critical alerts
  • Experience with different computing paradigms in databases such as In-Memory, Distributed, Massively Parallel Processing
  • Delivered data and analytics projects in any of the Cloud platforms (AWS / Azure / GCP)
  • Experience in delivering projects in a highly collaborative delivery model with teams at onsite and offshore
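As a hypothetical illustration only (not part of the posting), the "Python programs for data curation" plus "advanced SQL" combination above might look like this minimal, self-contained sketch: normalizing raw records in Python, loading them into a SQL table, and running an exploration query. All names here (curate, candidates, the sample rows) are invented for the example.

```python
# Illustrative sketch of a Python + SQL data-curation step.
# Uses only the standard library (sqlite3 standing in for a cloud database).
import sqlite3

raw_rows = [
    {"id": 1, "city": " Delhi ", "salary": "1200000"},
    {"id": 2, "city": "bangalore", "salary": None},   # missing value
    {"id": 3, "city": "Chennai", "salary": "950000"},
]

def curate(row):
    """Normalize a raw record: trim/titlecase city, default missing salary to 0."""
    return (
        row["id"],
        row["city"].strip().title(),
        int(row["salary"] or 0),
    )

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE candidates (id INTEGER, city TEXT, salary INTEGER)")
conn.executemany("INSERT INTO candidates VALUES (?, ?, ?)",
                 (curate(r) for r in raw_rows))

# Simple SQL exploration of the curated data
for city, n in conn.execute(
        "SELECT city, COUNT(*) FROM candidates GROUP BY city ORDER BY city"):
    print(city, n)
# prints: Bangalore 1 / Chennai 1 / Delhi 1
```

In a cloud deployment of the kind the role describes, the same curation function would typically run inside a managed service (e.g. an AWS Lambda or Glue job) writing to a warehouse such as Redshift rather than an in-memory SQLite database.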

Keyskills :
java, aws, etl, open source, etl tools, data quality, professional services, reference data, data curation, data modeling, work effectively, it delivery, distributed databases, sql, microsoft azure, object-oriented programming


© 2019 Hireejobs All Rights Reserved