Software Engineer, VIPER Big Data team

Sorry, this job was removed at 4:01 a.m. (MST) on Saturday, February 14, 2015

Software engineering and data science skills combined with the demands of a high volume, highly-visible analytics platform make this an exciting challenge for the right candidate.

Are you passionate about digital media, entertainment, and software services? Do you like big challenges and working within a highly motivated team environment?

As a software engineer on VIPER's Big Data team, you will research, develop, support, and deploy solutions within the Hadoop ecosystem and real-time distributed computing architectures. You will also employ your skills to deliver insights into customer and network behavior on a rapidly growing video-over-IP platform. The VIPER Big Data team is a new, small, and fast-moving team of world-class experts who are innovating in end-to-end video delivery. We are a team that thrives on big challenges, results, quality, and agility.

Who does the big data software engineer work with?

Big Data software engineering is a diverse collection of professionals who work with a variety of partners: other software engineering teams whose software integrates with analytics services, service delivery engineers who support our product, testers, operational stakeholders with all manner of information needs, and executives who rely on big data for ad hoc reports and analytical dashboards. We are often called upon in a pinch to provide the answer to a question that nobody else can.

What are some interesting problems you’ll be working on?

Develop systems capable of processing billions of events per day, providing both a real time and historical view into the operation of our video-over-IP systems. Design collection and enrichment system components for scale and reliability. Work on high performance in-memory real time data stores and a massive historical data store using Hadoop.
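As a rough illustration of the real-time side of this work, here is a minimal, self-contained sketch of time-bucketed event counting, the kind of aggregation a stream-processing topology (e.g., in Storm) might perform before results land in an in-memory store. The class and event names are hypothetical, and a production system would shard this across workers and expire old windows.

```java
import java.util.HashMap;
import java.util.Map;

// Tumbling-window event counter: buckets each event into a fixed-size
// time window and counts occurrences per event type. A single-process
// stand-in for what a distributed stream topology would do at scale.
public class WindowedCounter {
    private final long windowMillis;
    private final Map<Long, Map<String, Long>> windows = new HashMap<>();

    public WindowedCounter(long windowMillis) {
        this.windowMillis = windowMillis;
    }

    // Record one event: bucket by (window start, event type) and increment.
    public void record(String eventType, long timestampMillis) {
        long windowStart = (timestampMillis / windowMillis) * windowMillis;
        windows.computeIfAbsent(windowStart, w -> new HashMap<>())
               .merge(eventType, 1L, Long::sum);
    }

    // Count of an event type within the window containing the timestamp.
    public long count(String eventType, long timestampMillis) {
        long windowStart = (timestampMillis / windowMillis) * windowMillis;
        return windows.getOrDefault(windowStart, Map.of())
                      .getOrDefault(eventType, 0L);
    }

    public static void main(String[] args) {
        WindowedCounter counter = new WindowedCounter(60_000); // 1-minute windows
        counter.record("play_start", 1_000);
        counter.record("play_start", 2_000);
        counter.record("buffer_underrun", 61_000); // falls in the next window
        System.out.println(counter.count("play_start", 1_500));       // 2
        System.out.println(counter.count("buffer_underrun", 61_500)); // 1
    }
}
```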

Optimize metrics gathering and reporting for performance, using the best open source or vendor tool for the job. Build rich visualizations that tell a compelling story with data. Provide operational reports and dashboards to enable day-to-day business decisions.

Where can you make an impact?

Comcast VIPER is building the core components needed to drive the next generation of television. Running this infrastructure, identifying trouble spots, and optimizing the overall user experience is a challenge that can only be met with a robust big data architecture capable of providing insights that would otherwise be drowned in a sea of data.

 

Success in this role is best enabled by a broad mix of skills and interests ranging from traditional distributed systems software engineering prowess to the multidisciplinary field of data science.

Responsibilities:

·         Develop solutions to Big Data problems utilizing common tools found in the Hadoop ecosystem.

·         Develop solutions for real-time and offline event collection from various systems.


·         Develop, maintain, and perform analysis within a real-time architecture supporting large amounts of data from various sources.

·         Analyze massive amounts of data and help drive prototype ideas for new tools and products.

·         Design, build, and support APIs and services that are exposed to other internal teams.

·         Employ rigorous continuous delivery practices managed under an agile software development approach.

·         Ensure a quality transition to production and solid production operation of the software.

Here are some of the specific technologies we use:

·         Hadoop

·         Flume

·         Storm

·         MemSQL

·         Java

·         Maven

·         Git

·         Jenkins

·         Splunk/Hunk

·         Apache Pig

·         Unix/Linux
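Several of these tools (Hadoop, Pig) build on the MapReduce model: map each record to key/value pairs, then reduce by aggregating per key. As a loose, single-process illustration of that model, here is word count in plain Java streams; the real versions run the same two phases distributed across a cluster, with no Hadoop dependencies assumed here.

```java
import java.util.Arrays;
import java.util.Map;
import java.util.stream.Collectors;

// Word count in the map/reduce style: a "map" phase that emits tokens,
// followed by a "reduce" phase that sums occurrences per token.
public class WordCount {
    public static Map<String, Long> count(String[] lines) {
        return Arrays.stream(lines)
                // "map" phase: split each line into lowercase tokens
                .flatMap(line -> Arrays.stream(line.toLowerCase().split("\\s+")))
                .filter(token -> !token.isEmpty())
                // "reduce" phase: group by token and count occurrences
                .collect(Collectors.groupingBy(t -> t, Collectors.counting()));
    }

    public static void main(String[] args) {
        String[] lines = { "big data big challenges", "big team" };
        System.out.println(count(lines).get("big")); // 3
    }
}
```

The equivalent Pig or Hive job expresses the same GROUP BY/COUNT in a few lines of script, with the framework handling distribution.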

Skills & Requirements:

·         3+ years programming experience

·         Bachelors or Masters in Computer Science or related discipline

·         Experience in software development of large-scale distributed systems, including a proven track record of delivering backend systems that participate in a complex ecosystem.

·         Knowledge in Big Data related technologies and open source frameworks preferred.

·         Extensive experience programming in Java as well as experience in code optimization and high performance computing.

·         Experience with Java servlet containers or application servers such as JBoss, Tomcat, GlassFish, WebLogic, or Jetty.

·         Good current knowledge of Unix/Linux environments

·         Test-driven development/test automation, continuous integration, and deployment automation

·         Enjoy working with data – data analysis, data quality, reporting, and visualization

·         Good communicator, able to analyze complex issues and technologies and articulate them clearly and engagingly.

·         Great design and problem solving skills, with a strong bias for architecting at scale.

·         Adaptable, proactive and willing to take ownership.

·         Keen attention to detail and high level of commitment.

·         Comfortable working in a fast-paced agile environment.  Requirements change quickly and our team needs to constantly adapt to moving targets.

Nice to haves:

·         Experience with collection frameworks such as Flume, Kafka, or Splunk.

·         MapReduce experience in Hadoop utilizing Pig, Hive, or other query/scripting technology

·         Distributed (HBase or Cassandra or equivalent) or NoSQL (e.g. Mongo) database experience

·         Scripting tools such as Python

·         Git, Maven, Jenkins, Sonar, Nexus, Puppet

·         Understanding of and/or experience with serialization frameworks such as Thrift, Avro, Google Protocol Buffers, or Kryo preferred.

·         Visualization tools and libraries, reporting tools, etc.; Splunk/Hunk experience is ideal.

·         Good understanding of any of: advanced mathematics, statistics, or probability.
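To make the serialization bullet above concrete: frameworks like Thrift, Avro, and Protocol Buffers encode typed fields into a compact binary layout rather than text. The toy codec below illustrates that idea only, with a hypothetical two-field event schema; the real frameworks add schema evolution, code generation, and variable-length encodings.

```java
import java.nio.ByteBuffer;
import java.nio.charset.StandardCharsets;

// A toy fixed-schema binary codec: a long timestamp followed by a
// length-prefixed UTF-8 event type. Illustrative only; not a real
// Thrift/Avro/Protobuf wire format.
public class EventCodec {
    public static byte[] encode(long timestamp, String eventType) {
        byte[] name = eventType.getBytes(StandardCharsets.UTF_8);
        ByteBuffer buf = ByteBuffer.allocate(8 + 4 + name.length);
        buf.putLong(timestamp).putInt(name.length).put(name);
        return buf.array();
    }

    public static String decodeEventType(byte[] bytes) {
        ByteBuffer buf = ByteBuffer.wrap(bytes);
        buf.getLong();                       // skip the timestamp field
        byte[] name = new byte[buf.getInt()]; // read the length prefix
        buf.get(name);
        return new String(name, StandardCharsets.UTF_8);
    }

    public static void main(String[] args) {
        byte[] wire = encode(1_423_900_000_000L, "play_start");
        System.out.println(decodeEventType(wire)); // play_start
    }
}
```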

About Comcast VIPER (Video IP Engineering & Research):

VIPER (Video IP Engineering & Research) is a startup division within Comcast's Technology and Product organization, based in downtown Denver, CO, that spun out of the IP video and online projects originated within Comcast Interactive Media. We have built a cloud-based IP video infrastructure that delivers a broad mix of on-demand video, live TV streams, and an assortment of other digital media to an array of connected devices in the home.

Location

1401 Wynkoop Street, Denver, 80202
