Sr. Staff Engineer, Data


About Netskope

Today, there are more data and users outside the enterprise than inside, causing the network perimeter as we know it to dissolve. We realized a new perimeter was needed, one that is built in the cloud and follows and protects data wherever it goes, so we started Netskope to redefine Cloud, Network and Data Security.

Since 2012, we have built the market-leading cloud security company and an award-winning culture powered by hundreds of employees spread across offices in Santa Clara, St. Louis, Bangalore, London, Melbourne, Taipei, and Tokyo. Our core values are openness, honesty, and transparency, and we purposely developed our open desk layouts and large meeting spaces to support and promote partnerships, collaboration, and teamwork. From catered lunches and office celebrations to employee recognition events (pre and hopefully post-Covid) and social professional groups such as the Awesome Women of Netskope (AWON), we strive to keep work fun, supportive, and interactive. Visit us at Netskope Careers. Please follow us on LinkedIn and Twitter @Netskope.

About the role

Please note that this team is hiring across all levels; candidates are individually assessed and leveled appropriately based on their skills and experience.

The Data Engineering team builds and optimizes systems spanning data ingestion, processing, storage optimization, and more. We work closely with other engineers and the product team to build highly scalable systems that tackle real-world data problems and give our customers accurate, real-time, fault-tolerant solutions to their ever-growing data needs. We support various OLTP and analytics environments, including our Advanced Analytics and Digital Experience Management products.

We are looking for skilled engineers experienced in building and optimizing cloud-scale distributed systems to develop our next-generation ingestion, processing, and storage solutions. This is a hands-on, impactful role: you will help lead the development, validation, publishing, and maintenance of logical and physical data models that support these OLTP and analytics environments.

What's in it for you

  • You will be part of a growing team of renowned industry experts in the exciting space of Data and Cloud Analytics
  • Your contributions will have a major impact on our global customer base and across the industry through our market-leading products
  • You will solve complex, interesting challenges and improve the depth and breadth of your technical and business skills

What you will be doing

  • Designing and implementing planet-scale distributed data platforms, services, and frameworks, including solutions to address high-volume and complex data collection, processing, transformation, and analytical reporting
  • Working with the application development team to implement data strategies, build data flows and develop conceptual data models
  • Understanding and translating business requirements into data models supporting long-term solutions
  • Analyzing data system integration challenges and proposing optimized solutions
  • Researching and identifying effective data designs, new tools, and methodologies for data analysis
  • Providing guidance and expertise to the development community on effectively implementing data models and building high-throughput data access services
  • Providing technical leadership in all phases of a project from discovery and planning through implementation and delivery

Required skills and experience

  • 10+ years of hands-on experience in architecture, design or development of enterprise data solutions, applications, and integrations
  • Ability to conceptualize and articulate ideas clearly and concisely
  • Excellent algorithm, data structure, and coding skills, with programming experience in Java, Python, or Scala
  • Proficiency in SQL
  • Experience building products using at least one from each of the following categories of distributed technologies:
    • Relational Stores (e.g. Postgres, MySQL, or Oracle)
    • Columnar or NoSQL Stores (e.g. BigQuery, ClickHouse, or Redis)
    • Distributed Processing Engines (e.g. Apache Spark, Apache Flink, or Celery)
    • Distributed Queues (e.g. Apache Kafka, AWS Kinesis, or GCP Pub/Sub)
  • Experience with standard software engineering practices (e.g. unit testing, code reviews, design documents)
  • Experience working with GCP, Azure, AWS, or similar cloud platform technologies is a plus
  • Excellent written and verbal communication skills
  • Bonus points for contributions to the open source community

Education

  • BSCS or equivalent required, MSCS or equivalent strongly preferred


Netskope is committed to implementing equal employment opportunities for all employees and applicants for employment. Netskope does not discriminate in employment opportunities or practices based on religion, race, color, sex, marital or veteran status, age, national origin, ancestry, physical or mental disability, medical condition, sexual orientation, gender identity/expression, genetic information, pregnancy (including childbirth, lactation and related medical conditions), or any other characteristic protected by the laws or regulations of any jurisdiction in which we operate.

Netskope respects your privacy and is committed to protecting the personal information you share with us. Please refer to Netskope's Privacy Policy for more details.