Build the future of data. Join the Snowflake team.
The Snowflake Data Lake team’s mission is to power open standards with Snowflake innovation. Our customers want to bring more data to Snowflake to support a variety of data lake use cases with large data sets, but they face common challenges around control, cost, and interoperability. This team aims to address these challenges and enable customers to benefit from Snowflake’s rich features and integrated platform capabilities while embracing their choice of open table standards (e.g., Apache Iceberg), file formats (e.g., Apache Parquet), storage solutions, and third-party open source tool sets (e.g., Apache Spark). We are early in our journey to build the best data lake solutions for any workload at scale.
We are seeking talented Senior Software Engineers who are technical leaders in the big data open source community to join us to define our strategy, engage with and deliver innovation to the open source community, and bring Snowflake to millions of big data professionals.
AS A SENIOR SOFTWARE ENGINEER AT SNOWFLAKE, YOU WILL:
- Understand customer requirements and define product strategies.
- Design, develop, and operate highly reliable, large-scale data lake systems.
- Embrace Snowflake innovations with open source standards and tool sets.
- Be an active influencer for the direction of open source standards.
- Partner closely with Product teams to understand requirements and design cutting-edge new capabilities that go directly into customers’ hands.
- Analyze and solve fault-tolerance and high-availability issues as well as performance and scale challenges.
- Ensure operational excellence of the services and meet the commitments to our customers regarding reliability, availability, and performance.
THE IDEAL CANDIDATE WILL HAVE MOST OF THE FOLLOWING QUALIFICATIONS:
- 8+ years of hands-on experience with large-scale, data-intensive distributed systems, especially distributed file systems, object storage, data warehouses, data lakes, data analytics, and data platform infrastructure.
- Strong development skills in Java and C++.
- An active PMC (Project Management Committee) member of or committer on open source projects such as Apache Iceberg, Parquet, Spark, Hive, Flink, Delta Lake, Presto, Trino, or Avro.
- Proven track record of leading and delivering large and complex big data projects across organizations.
- A growth mindset and excitement about challenging the status quo by seeking innovative solutions.
- An excellent team player who consistently makes everyone around you better.
- Experience with public clouds (AWS, Azure, GCP) is a plus.
- BS/MS in Computer Science or a related field, or equivalent experience.
Every Snowflake employee is expected to follow the company’s confidentiality and security standards for handling sensitive data. Snowflake employees must abide by the company’s data security plan as an essential part of their duties. It is every employee's duty to keep customer information secure and confidential.