Staff DevOps Engineer - Hadoop - Big Data - Federal
Company: ServiceNow
Location: Santa Clara
Posted on: May 12, 2022
Job Description:
Company Description

At ServiceNow, our technology makes the world work for everyone, and our people make it possible. We move fast because the world can't wait, and we innovate in ways no one else can for our customers and communities. By joining ServiceNow, you are part of an ambitious team of change makers who have a restless curiosity and a drive for ingenuity. We know that your best work happens when you live your best life and share your unique talents, so we do everything we can to make that possible. We dream big together, supporting each other to make our individual and collective dreams come true. The future is ours, and it starts with you.

With more than 7,400 customers, we serve approximately 80% of the Fortune 500, and we're on the 2021 list of FORTUNE World's Most Admired Companies. Learn more on the Life at Now blog and hear from our employees about their experiences working at ServiceNow.

Job Description

Please Note: This position will include supporting our US Federal customers. This position requires passing a ServiceNow background screening, USFedPASS (US Federal Personnel Authorization Screening Standards). This includes a credit check, a criminal/misdemeanor check, and a drug test. Any employment is contingent upon passing the screening. Due to Federal requirements, only US citizens, US naturalized citizens, or US Permanent Residents holding a green card will be considered.

The
Big Data team plays a critical and strategic role in ensuring that ServiceNow can exceed the availability and performance SLAs of the ServiceNow Platform powered customer instances deployed across the ServiceNow cloud and Azure cloud.

Our mission is to deliver state-of-the-art Monitoring, Analytics, and Actionable Business Insights by employing new tools, Big Data systems, an Enterprise Data Lake, AI, and Machine Learning methodologies that improve efficiencies across a variety of functions in the company: Cloud Operations, Customer Support, Product Usage Analytics, and Product Upsell Opportunities, enabling a significant impact on both top-line and bottom-line growth.

The Big Data team is responsible for:
- Collecting, storing, and providing real-time access to large amounts of data
- Providing real-time analytic tools and reporting capabilities for various functions, including:
  - Monitoring, alerting, and troubleshooting
  - Machine Learning, anomaly detection, and prediction of P1s
  - Capacity planning
  - Data analytics and deriving Actionable Business Insights

What you get to do in this
role:
- Deploy, monitor, maintain, and support Big Data infrastructure and applications on ServiceNow Cloud and Azure environments.
- Architect and drive end-to-end Big Data deployment automation, from vision through delivery, covering the Big Data foundational modules (Cloudera CDP), prerequisite components, and applications, leveraging Ansible, Puppet, Terraform, Jenkins, Docker, and Kubernetes across all ServiceNow environments.
- Automate Continuous Integration / Continuous Deployment (CI/CD) data pipelines for applications leveraging tools such as Jenkins, Ansible, and Docker.
- Performance-tune and troubleshoot the various Hadoop components and other data analytics tools in the environment: HDFS, YARN, Hive, HBase, Spark, Kafka, RabbitMQ, Impala, Kudu, Redis, Hue, Kerberos, Tableau, Grafana, MariaDB, and Prometheus.
- Provide production support to resolve critical Big Data pipeline and application issues, mitigating or minimizing any impact on Big Data applications.
- Collaborate closely with Site Reliability Engineering (SRE), Customer Support (CS), Development, QA, and System Engineering teams in replicating complex issues, leveraging broad experience with UI, SQL, full-stack, and Big Data technologies.
- Enforce data governance policies in Commercial and Regulated Big Data environments.

Qualifications

To be
successful in this role, you have:
- 6+ years of overall experience, with at least 4 years as a Big Data DevOps / Deployment Engineer
- Demonstrated expert-level experience in delivering end-to-end deployment automation leveraging Puppet, Ansible, Terraform, Jenkins, Docker, Kubernetes, or similar technologies
- Deep understanding of the Hadoop/Big Data ecosystem: good knowledge of querying and analyzing large amounts of data on Hadoop HDFS using Hive and Spark Streaming, and of working on systems such as HDFS, YARN, Hive, HBase, Spark, Kafka, RabbitMQ, Impala, Kudu, Redis, Hue, Tableau, Grafana, MariaDB, and Prometheus
- Experience securing the Hadoop stack with Sentry, Ranger, LDAP, and Kerberos KDC
- Experience supporting CI/CD pipelines on Cloudera in native cloud and Azure/AWS environments
- Good knowledge of Perl, Python, Bash, Groovy, and Java
- In-depth knowledge of Linux internals (CentOS 7.x) and shell scripting
- Ability to learn quickly in a fast-paced, dynamic team environment

JB0019084

Additional Information

ServiceNow is an
Equal Employment Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, creed, religion, sex, sexual orientation, national origin or nationality, ancestry, age, disability, gender identity or expression, marital status, veteran status, or any other category protected by law.

All new employees hired in the United States are required to be fully vaccinated against COVID-19, subject to such exceptions as required by law. If hired, you will be required to submit proof of full vaccination or have an approved accommodation by your start date. Visit our Candidate FAQ page to learn more.

If you require a reasonable accommodation to complete any part of the application process, or are limited in your ability or unable to access or use this online application process and need an alternative method for applying, you may contact us at talent.acquisition@servicenow.com for assistance.

For positions requiring access to technical data subject to export control regulations, including the Export Administration Regulations (EAR), ServiceNow may have to obtain export licensing approval from the U.S. Government for certain individuals. All employment is contingent upon ServiceNow obtaining any export license or other approval that may be required by the U.S. Government.

Please Note: Fraudulent job postings/job scams are increasingly common. Learn what to watch out for and how to protect yourself. All genuine ServiceNow job postings can be found through the ServiceNow Careers site.

Work Personas

Work personas are categories
that are assigned to employees depending on the nature of their work. Employees will fall into one of three categories: Remote, Flexible, or Required in Office.

Required in Office: A required-in-office work persona is defined as an employee who is contracted to work from, or aligned to, a ServiceNow-affiliated office. This persona is required to work from their assigned workplace location 100% of the work week based on the business needs of their role.

Flexible: A flexible work persona is defined as an employee who is contracted to work from, or aligned to, a ServiceNow-affiliated office and will work from their assigned workplace location roughly 3 days/week or less (generally around 40-60% of the work week). Flexible employees may choose to work the remaining working time from their workplace location or home. Flexible employees are required to work within their state, province, region, or country of employment.

Remote: A remote work persona is defined as an employee who performs their responsibilities exclusively outside of a ServiceNow workplace and is not contracted or aligned to a ServiceNow-affiliated office, including those whose place of work (pursuant to their terms and conditions of employment) is their home. Remote employees are required to work within their state, province, region, or country of employment.