Job Location | Bangalore |
Education | Not Mentioned |
Salary | Not Disclosed |
Industry | IT - Software |
Functional Area | General / Other Software |
Employment Type | Full-time |
Data Engineer: Big Data

Your Role and Responsibilities
As a Data Engineer, you are responsible for designing, developing, optimizing, and troubleshooting complex data-intensive ETL processes.

Responsibilities:
- Demonstrate expertise in Neo4j to store, query, analyze, and manage highly connected data using native graph storage, native graph processing, graph scalability, high availability, graph clustering, graphs in the cloud, graphs on Spark, built-in ETL, and integration support
- Develop solutions in JupyterLab and/or Apache Zeppelin with Python and PySpark
- Implement graph analysis algorithms with Cypher and common Neo4j APIs

If you thrive in a dynamic, collaborative workplace, IBM provides an environment where you will be challenged and inspired every single day. And if you relish the freedom to bring creative, thoughtful solutions to the table, there's no limit to what you can accomplish here.

Posting Country | India |
State / Province | KARNATAKA |
City / Township / Village | Bangalore |
Travel Required | Up to 10% or 1 day a week |
Position Type | Professional |
Required Education | Bachelor's Degree |

Required Technical and Professional Expertise:
- Minimum 5+ years of experience in the IT industry
- Hands-on experience with the Neo4j graph database, Cypher query writing, and building pipelines using Python
- Experience of ETL technical design, automated data-quality testing, QA and documentation, data warehousing, and data modelling
- Extensive experience using RDBMS, ETL pipelines, Python, Hadoop, SQL, and data wrangling
- Solid experience in technologies covering data integration, data ingestion, cloud, real-time streaming (Kafka), CDC, Impala, NoSQL databases, and data warehousing
- Working knowledge of API microservices
- Expertise in DB architecture and patterns
- Hands-on experience with RESTful APIs and knowledge of modern code-development practices

Eligibility Requirements | NA |

Preferred Technical and Professional Experience:
- You love collaborative environments that use agile methodologies to encourage creative design thinking and find innovative ways to develop with cutting-edge technologies
- Ambitious individual who can work under their own direction towards agreed targets/goals, with a creative approach to work
- Intuitive individual with an ability to manage change and proven time management
- Proven interpersonal skills while contributing to team effort by accomplishing related results as needed
- Keeps technical knowledge up to date by attending educational workshops and reviewing publications

Preferred Education | Master's Degree |
Experience | 5+ years |
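The role above centers on two tasks: querying highly connected data with Cypher, and automated data-quality testing inside ETL pipelines. A minimal sketch of both is below; the function names, the graph schema (`Person`, `Company`, `WORKS_AT`), and the query shape are illustrative assumptions, not taken from the posting. In practice the Cypher string would be executed through the official neo4j Python driver (e.g. `session.run(query, **params)`), which is omitted here since it requires a live database.

```python
def find_coworkers_query(name):
    """Build a parameterized Cypher query over a hypothetical schema:
    find people who work at the same company as the given person.
    Returned as (query, params) for use with the neo4j Python driver."""
    query = (
        "MATCH (p:Person {name: $name})-[:WORKS_AT]->(c:Company)"
        "<-[:WORKS_AT]-(coworker:Person) "
        "RETURN DISTINCT coworker.name AS name"
    )
    return query, {"name": name}


def check_not_null(rows, columns):
    """A simple automated data-quality check for an ETL step: report
    every row/column pair where a required column is missing or None."""
    violations = []
    for i, row in enumerate(rows):
        for col in columns:
            if row.get(col) is None:
                violations.append(f"row {i}: {col} is null")
    return violations


rows = [{"id": 1, "name": "a"}, {"id": 2, "name": None}]
print(check_not_null(rows, ["id", "name"]))  # -> ['row 1: name is null']
```

Parameterizing the Cypher query (`$name`) rather than interpolating strings is the standard practice: it avoids injection and lets Neo4j cache the query plan.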
Keyskills :
data wrangling, data warehousing, data quality, time management, interpersonal skills, high availability, agile methodologies, sql, informatica, technical design, educational workshops, quality testing, etl, query writing, data integration, java, data modeling, python