Job description
The mission of TikTok's Global Security Organization is to build and earn trust by reducing risk and securing our businesses and products.
Also known as GSO, this team is the foundation of our efforts to keep TikTok safe, secure, and operating at scale for over 1 billion people around the world.
We work to ensure that the TikTok platform is safe and secure, that our users' experience and their data remains safe from external or internal threats, and that we comply with global regulations wherever TikTok operates.
Trust is one of TikTok's biggest initiatives, and security is integral to our success.
In whatever ways users interact with us — whether they're watching videos on their For You page, interacting with a Live video, or buying products on TikTok Shop — GSO protects their data and privacy, so they can have a secure and trustworthy experience.
The data engineer for forensics builds and maintains robust data pipelines that ensure data is collected, processed, and produced in a forensically sound and legally defensible manner.
Responsibilities:
- Big data management: Architect and manage large-scale data storage solutions like data lakes and warehouses to handle petabytes of data from servers, network devices, and cloud environments.
- Data integrity and chain of custody: Ensure the integrity of the data is maintained throughout the entire e-discovery lifecycle.
This includes documenting and maintaining a secure chain of custody to ensure the data is admissible in court.
- Data management and loading: Ingest, integrate, and organize processed data into review platforms for attorneys to analyze.
This includes handling various native file formats and metadata.
- Workflow automation: Build and maintain data pipelines to automate and orchestrate e-discovery processes, including data ingestion, processing, and production.
This increases efficiency and reduces manual errors.
- Processing and culling: Use specialized e-discovery software (e.g., Relativity, Nuix) to process massive datasets by filtering, de-duplicating, and preparing the data for review.
The engineer applies advanced search techniques to cull the data and reduce the volume of information for legal review.
- Tooling: Create requirements and work with stakeholders to design, implement, and deploy security tools that improve the efficiency of forensic workflows.
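The forensic-soundness duties above (hashing evidence at acquisition, then re-verifying it before production) can be sketched in Python. This is an illustrative sketch only; the function names `sha256_file`, `record_custody`, and `verify_integrity` are hypothetical, not part of any named forensic tool's API:

```python
import hashlib
import json
from datetime import datetime, timezone
from pathlib import Path

def sha256_file(path: Path, chunk_size: int = 1 << 20) -> str:
    """Hash a file in chunks so large evidence files never load fully into memory."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

def record_custody(path: Path, custodian: str) -> dict:
    """Create a chain-of-custody manifest entry at acquisition time."""
    return {
        "file": str(path),
        "sha256": sha256_file(path),
        "custodian": custodian,
        "acquired_utc": datetime.now(timezone.utc).isoformat(),
    }

def verify_integrity(entry: dict) -> bool:
    """Re-hash the file and compare against the hash recorded at acquisition."""
    return sha256_file(Path(entry["file"])) == entry["sha256"]
```

In practice each manifest entry would be appended to an immutable audit log (e.g., serialized with `json.dumps`) so that every hand-off in the custody chain is documented.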
Minimum Qualifications:
- Programming: Strong proficiency in Python for building automated data pipelines and scripts.
- Database expertise: In-depth knowledge of SQL and NoSQL databases, and experience with database architecture and data modeling.
- Big data frameworks: Experience with tools like Apache Spark and Kafka for processing and streaming large volumes of data.
- Cloud computing: Proficiency with cloud platforms such as AWS, Azure, and Google Cloud, which are often central to large-scale data investigations.
- Forensic tools: Knowledge of both traditional forensic tools (e.g., FTK, EnCase, Autopsy, Relativity, Nuix) and how to integrate their outputs into a larger data processing system.
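The de-duplication step described under processing and culling can be illustrated with a small content-hash cull. The function below is a hypothetical sketch, not how Relativity or Nuix implement it; it keeps the first copy of each unique document and records which files were culled as duplicates, so reviewers can still trace every custodian copy:

```python
import hashlib

def cull_duplicates(documents: dict[str, bytes]) -> tuple[dict[str, bytes], dict[str, list[str]]]:
    """De-duplicate documents by SHA-256 content hash, keeping the first copy seen.

    Returns the unique document set plus a map from each kept file
    to the duplicate files culled in its favor.
    """
    unique: dict[str, bytes] = {}
    seen: dict[str, str] = {}        # content hash -> name of the file kept
    duplicates: dict[str, list[str]] = {}
    for name in sorted(documents):   # deterministic processing order
        digest = hashlib.sha256(documents[name]).hexdigest()
        if digest in seen:
            duplicates.setdefault(seen[digest], []).append(name)
        else:
            seen[digest] = name
            unique[name] = documents[name]
    return unique, duplicates
```

At petabyte scale the same hash-and-group pattern would typically run on a distributed engine such as Spark rather than in a single process, but the culling logic is the same.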
Preferred Qualifications:
- 5+ years of applicable experience in data engineering, programming, or related fields.
- Digital forensics process: A clear understanding of the standard digital forensics process, including the phases of acquisition, preservation, analysis, and reporting.
- Legal and ethical compliance: Knowledge of relevant laws, regulations (like GDPR and CCPA), and ethical considerations surrounding digital evidence collection and privacy.
- Chain of custody: An understanding of the strict procedural requirements for maintaining the integrity and legal admissibility of digital evidence.
Required Skill Profession: Computer Occupations