Position: Big Data Architect

    We are looking for a Big Data Lead with 6-10 years of total experience and 3-5 years of Big Data experience for our Fortune 500 clients, primarily in North America.

    Responsibilities:

    • Demonstrate the ability to translate functional/high-level designs into detailed technical designs
    • Understand basic relational database concepts and design schemas based on the technical design
    • Understand and follow the processes defined for each project
    • Deliver solid design, coding, testing, and debugging
    • Lead a team of two or three junior engineers and mentor them
    • Help debug client-side issues in production or UAT
    • Completely own a module and see it through to completion
    • Design and implement Java and Big Data software components from scratch, based on the specifications provided
    • Develop key technologies that support data management, including collection, processing, transformation, storage, and analytics
    • Support the implementation of application software releases and related activities
    • Provide debugging and code analysis support
    • Troubleshoot production issues

    Requirements/Qualifications:

    • Proven architecture and infrastructure design experience with big data batch and real-time technologies such as Hadoop, Hive, Pig, HBase, MapReduce, Spark, Storm, and Kafka
    • Experience with Big Data analytics: ETL, in-stream processing, batch processing, querying, workflows, and workflow and query optimization
    • Expert in writing SQL scripts
    • Knowledge of cloud computing infrastructure (e.g., Amazon Web Services EC2, Elastic MapReduce) and design considerations for scalable, distributed systems is a plus
    • Expert knowledge of an enterprise-class RDBMS
    • Excellent verbal/written communication and data presentation skills, including the ability to succinctly summarize key findings and communicate effectively with both business and technical teams; ability to balance and prioritize multiple conflicting requirements with high attention to detail
    • Comfortable working in a Linux environment
    • Experience with a scripting language such as Python, Perl, Ruby, or JavaScript
    • Knowledge of AWS products and services
    • Exposure to predictive/advanced analytics and tools such as R, SAS, and MATLAB
    • Exposure to NoSQL databases such as DynamoDB and MongoDB
    • Ability to lead architecture, technology selection, and implementation of a data platform built on big data technologies
    • Ability to contribute to technical design efforts, ensuring the system design meets scalability and performance requirements
    • Ability to provide technical mentorship to junior engineers on new technologies and development techniques
    • Firm understanding of major programming/scripting languages such as Java, Python, and R
    • Experience designing solutions for multiple large data warehouses, with a good understanding of cluster and parallel architectures
    Apply Now

