FEDERAL RESERVE BANK OF SAN FRANCISCO
Our Work Protects the Dollars of Everyday Americans
CLOUD ENGINEER (DATA ENGINEER)
LOCATION: Salt Lake City, UT; Los Angeles, CA; Seattle, WA; Portland, OR; or Phoenix, AZ
Why Work For The Fed?
At the Federal Reserve Bank of San Francisco, we believe in the diversity of our people, ideas, and experiences and are committed to building an inclusive culture that is representative of the communities we serve.
Are you passionate about large bank supervision and being part of a dynamic team? Interested in an opportunity to collaborate with colleagues in the 12th District and across the Federal Reserve System (FRS) on a wide range of supervisory activities related to large firms’ financial resilience? If yes, then read on!
Fulfilling Careers That Make a Difference
The Federal Reserve Bank of San Francisco is looking for a Cloud Engineer to join the Advanced Data and Analytics Capabilities Team. We are a San Francisco-based team that partners with business lines across the Federal Reserve System to deliver big data and advanced analytics products and solutions. In this role, you will contribute to several high-quality data solutions and have the opportunity to apply your critical thinking and enhance your technical skills across many disciplines. We employ state-of-the-art technologies from the Hadoop ecosystem, including tools for data integration, data modeling, and data analytics.
In this role, you will deliver high-quality technology solutions that address business needs by building platform capabilities and applications for customer business lines. Strong communication skills are essential: you will work closely with other groups on the development and testing of your assigned application components to ensure successful project delivery.
Highlights of Responsibilities:
Design, develop, and maintain end-to-end data solutions using open-source, modern data lake, and enterprise data warehouse technologies (Hadoop, Spark, cloud platforms, etc.)
Contribute to multiple data solutions throughout their entire lifecycle (conception to launch)
Partner with business stakeholders to understand and meet their data requirements
Design, build, and maintain machine learning data pipelines
Maintain security in accordance with Bank security policies
Participate in an Agile development environment, including daily standups and sprint planning
Develop code in Big Data environments using languages such as Java and Python
Communicate solution designs to technical and business product managers, as well as third parties
Act as a role model, thought leader, and change agent for new software and technology throughout the company
Work on multiple projects as a technical team member or lead, driving the elaboration, design, and development of software
Collaborate with Developers, DevOps, Release Management, and Operations teams
Develop, execute, and document unit test plans and support application testing
Provide operational support for applications and utilities; tackle issues and participate in defect and incident root cause analyses
Assist in the deployment of new modules, upgrades, and fixes to the production environment
Independently determine methods and procedures on new assignments, and may provide work direction to others
Requirements:
- Bachelor's degree in Computer Science, Information Systems, or another related field, or equivalent work experience
- 5+ years of data engineering experience and programming skills in Java and/or Python, including knowledge of the Big Data ecosystem
- Experience with the Hadoop ecosystem, including HDFS, data distribution and processing, and workflow tools (Hive, Impala, Spark, Oozie)
- Experience programming and scripting on UNIX/Linux (e.g., Python or Bash)
- Experience with Control-M, cron, and batch job scheduling
- Experience performing operational support
- Passion for technology and data; a critical thinker, problem solver, and self-starter
- Strong quantitative and analytical skills
- Ability to communicate effectively (both verbal and written) and work in a team environment
- Ability to balance multiple assignments and shift gears when new priorities arise
- Familiarity with Agile methodologies
- Must be a U.S. citizen or Green Card holder with intent to become a U.S. citizen
Preferred Qualifications:
- Working experience at government or quasi-government organizations
- Cloud experience, including the use of big data technologies in the cloud
- Professional experience optimizing machine learning workflows and maintaining data pipelines
- Hands-on experience with a variety of big data (Hadoop / Cloudera, Cloud, etc.) and machine learning (Spark, AWS SageMaker, etc.) technologies
At the Federal Reserve Bank of San Francisco, we offer a wonderful benefits package, including:
- Medical, Dental, and Vision
- Defined Benefit Pension Plan
- Pre-tax Flexible Spending Account
- Paid Family Leave
- Backup Child Care Program
- Pre-tax Day Care Flexible Spending Account
- Vacation Days, Sick Days, and Paid Holidays
- Pet Insurance
- Matching 401(k)
We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment.
At the Federal Reserve Bank of San Francisco, we believe in the diversity of our people, ideas, and experiences and are committed to building an inclusive culture that is representative of the communities we serve. The Federal Reserve Bank of San Francisco is an Equal Opportunity Employer.