
Data Architect Associate Sr

3.00 to 6.00 Years   Hyderabad   04 Apr, 2019
Job Location: Hyderabad
Education: Not Mentioned
Salary: Not Disclosed
Industry: Banking / Financial Services
Functional Area: General / Other Software
Employment Type: Full-time

Job Description

As a Business Data Architect & Data Engineer Lead, you will work as a consultative team member. Business Data Architects function as Hadoop team leads on engagements that support the implementation of AtScale or other OLAP-on-Hadoop tools (AtScale, Kyvos, or similar). Engaging from the Proof of Concept (POC) stage through to implementation in a complex, distributed production environment, you will work collaboratively to successfully deploy and implement AtScale in production. You've got the technical depth to roll up your sleeves and work with Hadoop and Hive, and the polish to present best practices and other solutions as needed.

The candidate must have the ability to partner with stakeholders inside and outside of the department to understand their needs and support the design and development of robust data solutions that meet and/or exceed customer expectations. The ideal candidate will possess JPM institutional data and system knowledge, technical skills, an understanding of data science, and a commitment to producing high-quality results. Experience leading teams that deliver solutions through agile methodology, and collaborating on backlog grooming and prioritization, will be a strong plus.

Other responsibilities will include the following:

  • Facilitate use case & requirements definition, design, testing, and implementation of new data assets and analytic capabilities that address specific business needs.
  • Design and architect solutions, scope new engagements and implementations (both short- and long-term), and guide the team during product implementations.
  • Resolve technical issues and advise on best practices for big data, Hadoop environments, and AtScale.
  • Drive successful installation, configuration, tuning, and performance of the product.
  • Assist with capacity planning so the environment can scale.
  • Write and produce technical documentation.
  • Be meticulous about tracking work and following through.
  • Work with customers on how to use the AtScale Design Center for modeling solutions.
  • Apply strong Agile development experience, including SCRUM or similar methodologies.
  • Enable the management of data as a corporate asset: define data (metadata), identify systems of record and authoritative sources, create data quality rules, define security requirements, create data flow diagrams, and administer firm-wide principles, standards, and controls.
  • Support technology data modelers in creating conceptual & logical models that describe a particular domain of data, and use these models to inform the physical design of data-related projects.
  • Develop strategic roadmaps for the customer data domain that describe a sequence of projects to improve the management and utility of the data for the business.
  • Identify areas for efficiency across data domains, such as the elimination of duplicate data or platforms.
  • Conduct business process analysis, identify the data needed to support those processes, and determine whether the firm's data is fit for use within a given process.
  • Conduct research and development with emerging technologies, determine their applicability to business use cases, and document & communicate their recommended use in the firm.

The candidate must have an educational background in Computer Science, MIS, Software Engineering, or a related discipline. In addition, the candidate must have a minimum of a Bachelor's degree with 5 years of experience or a Master's degree with 2 years of experience.

Required Skills
  • 5+ years working with Hadoop big data environments using MapReduce, Hive, SparkSQL, Sqoop, HDFS, and Spark
  • Experience with at least one big data consumption OLAP tool (AtScale, Kyvos, or another OLAP-on-Hadoop product).
  • BI experience is a must (Tableau, Cognos, Qlik, BO, SSAS)
  • Java, Scala, Python, and shell scripting a plus
  • 3+ years of professional services (customer-facing) experience architecting large-scale storage, data center, and/or globally distributed solutions
  • Experience with Hadoop-related tools and technologies is a must (HDFS and/or MapReduce, HBase, Hive, Spark, Impala)
  • Ability to understand big data use-cases and recommend standard design patterns commonly used in Hadoop-based deployments
  • Familiarity with data warehousing concepts, especially OLAP
  • Familiarity with semantic layers and creating dimensional models.
  • Knowledge of complex data pipelines and data transformation
  • Coding skills to profile, wrangle, and prepare data from diverse sources to support analytical efforts
  • Write data transformation logic in languages such as Python, SAS, Spark, or Scala (see the sketch after this list).
  • Minimum five years of applications architect experience; technology implementation experience and certifications a plus.
  • In-depth knowledge of business and wholesale banking data
  • Knowledge of version control tools and processes (e.g. Subversion, Git)
  • Technical understanding of common RDBMS platforms (e.g., Teradata, Oracle)
  • Understanding of Hadoop-related technologies & their applications
  • Familiarity with various data modeling techniques and tools, along with knowledge of the SQL language
  • Must be able to deliver high-quality results under tight deadlines and be comfortable managing large projects
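
The data preparation items above (profiling and wrangling data, writing transformation logic in Spark or Python, and surfacing Hive tables for OLAP tools such as AtScale) can be illustrated with a short sketch. The following PySpark snippet is only a minimal, hypothetical example: the file path, database, table, and column names are invented for illustration and are not part of this posting.

```python
# Minimal sketch: profile and transform raw data with PySpark, then publish an
# aggregated Hive table that an OLAP-on-Hadoop tool (e.g. AtScale) could model.
# All paths, database/table names, and columns are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = (
    SparkSession.builder
    .appName("data-prep-sketch")
    .enableHiveSupport()      # read/write tables registered in the Hive metastore
    .getOrCreate()
)

raw = spark.read.csv("/data/raw/transactions.csv", header=True)

cleaned = (
    raw
    .dropDuplicates(["txn_id"])                        # drop duplicate records
    .withColumn("txn_date", F.to_date("txn_date"))     # normalize types
    .withColumn("amount", F.col("amount").cast("double"))
    .filter(F.col("account_id").isNotNull())           # discard unusable rows
)

# Quick profile: null counts per column, a first check on fitness for use.
cleaned.select(
    [F.count(F.when(F.col(c).isNull(), c)).alias(c) for c in cleaned.columns]
).show()

# Aggregate into a summary table suitable for a semantic layer or cube measure.
daily = cleaned.groupBy("txn_date", "account_id").agg(
    F.sum("amount").alias("total_amount")
)
daily.write.mode("overwrite").saveAsTable("analytics.daily_account_totals")
```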
Preferred Skills
  • Strong skills in project management methodologies and tools; Agile training and certification a plus.
  • Advanced analytical thinking and problem solving skills
  • Proven experience with data infrastructure initiatives, best practices and key components
  • Knowledge of the business lending, card and deposit lifecycle, including origination, sale/servicing, default management/loss mitigation and fraud.
  • Ability to create conceptual & logical models to describe a particular domain of data and use these models to inform the physical design of data-related projects (a minimal sketch follows this list)
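
For the modeling items above (semantic layers, dimensional models, and conceptual & logical models that inform physical design), here is a minimal sketch. The star schema, database, table, and column names below are hypothetical illustrations, not a prescribed design.

```python
# Minimal sketch: a hypothetical star schema (one fact table, two dimensions)
# created through Spark SQL, i.e. the kind of physical design that a logical
# model of a data domain would inform. Names and types are illustrative only.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("dimensional-model-sketch")
    .enableHiveSupport()
    .getOrCreate()
)

spark.sql("CREATE DATABASE IF NOT EXISTS analytics")

spark.sql("""
    CREATE TABLE IF NOT EXISTS analytics.dim_customer (
        customer_key   BIGINT,
        customer_id    STRING,
        segment        STRING,
        effective_date DATE
    ) USING PARQUET
""")

spark.sql("""
    CREATE TABLE IF NOT EXISTS analytics.dim_date (
        date_key       INT,
        calendar_date  DATE,
        month_of_year  INT,
        calendar_year  INT
    ) USING PARQUET
""")

spark.sql("""
    CREATE TABLE IF NOT EXISTS analytics.fact_transaction (
        txn_id       STRING,
        customer_key BIGINT,   -- references dim_customer.customer_key
        date_key     INT,      -- references dim_date.date_key
        amount       DOUBLE
    ) USING PARQUET
""")
```

Measures (e.g. SUM(amount)) and dimensions defined over tables like these are what a semantic layer such as the AtScale Design Center would expose to BI tools.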