
OpenAI’s Sora: An Objective Overview of the New AI Video App

Published October 2, 2025

What is Sora?

Sora is a new social app built around short videos generated by OpenAI models (currently “Sora 2”). It offers a TikTok-style feed and lets users create realistic clips with synthetically generated visuals and audio, then share them directly in the app. The initial rollout was invite-only and limited to North American markets.

Key Features

  • Video and audio generation: Short clips created from text prompts and/or an input image, with a synthetic audio track.
  • “Cameo” (user likeness): The ability to upload your own likeness and voice to insert yourself into generated scenes; control settings are available.
  • Algorithmic feed and user profile: Browse a stream of content and publish your own clips from an in-app account.

Moderation and Safeguards

OpenAI states that it applies a set of guardrails and usage policies, including restrictions on sexual content, violence, extremism, hate speech, and content promoting self-harm. The company also indicates limits and monitoring around the use of third-party likenesses. Parental controls have been introduced for teen accounts.

Sora: Early Post-Launch Observations

After launch, the app quickly featured deepfakes of public figures, historical characters, and pop-culture references. Some clips gained significant popularity while also prompting discussion about copyright compliance and broader risks. OpenAI reported removing accounts that violated its rules and announced further strengthening of safeguards.

Legal and Ethical Considerations

  • Copyrights and trademarks: Early tests suggest some characters and franchises may be blocked, while others (e.g., certain game-related motifs) appeared in content; the scope and effectiveness of filters remain under discussion.
  • Likeness and consent: The cameos feature relies on user biometrics. Abuse risks (e.g., generating content featuring someone without their consent) are well documented in the synthetic-media literature, so OpenAI pairs technical constraints with policy rules.
  • Disinformation: Media experts note that improving quality of generated clips makes synthetic content harder to distinguish from real footage, potentially affecting trust in online video.

Safety Context Around OpenAI Products

Separately from Sora, in August 2025 a lawsuit was filed against OpenAI concerning the alleged role of ChatGPT in a teenager’s death. The company announced changes to protections for minors and vulnerable groups. The case is ongoing and does not determine legal liability, but it is an important backdrop for discussions about AI product safety.

Privacy and Data Security in Sora

Using cameos involves processing biometric data (facial imagery, voice samples). From a compliance and security standpoint, recommended practices include:

  • Enabling usage restrictions on one’s likeness (who may use it),
  • Regularly reviewing permissions,
  • Providing informed consent and reviewing terms and content-usage policies.
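The first two practices above, restricting who may use a likeness and reviewing permissions, can be thought of as an access-control check. The sketch below models this in Python; the permission levels, names, and `may_use_likeness` helper are illustrative assumptions for reasoning about the policy, not Sora's actual settings or API:

```python
# Illustrative model of likeness-usage restrictions for a cameo.
# The permission levels and function below are assumptions for the
# sketch, not Sora's real configuration options.
from enum import Enum


class CameoAccess(Enum):
    ONLY_ME = "only_me"          # owner alone may use the likeness
    APPROVED = "approved_users"  # an allow-list of specific users
    EVERYONE = "everyone"        # any user may use it


def may_use_likeness(setting: CameoAccess,
                     requester: str,
                     owner: str,
                     approved: set[str]) -> bool:
    """Return True if `requester` may insert the owner's cameo into a scene."""
    if requester == owner:
        return True
    if setting is CameoAccess.ONLY_ME:
        return False
    if setting is CameoAccess.APPROVED:
        return requester in approved
    return True  # CameoAccess.EVERYONE


# With ONLY_ME, a third party is always blocked; with an allow-list,
# only approved users pass the check.
assert may_use_likeness(CameoAccess.ONLY_ME, "bob", "alice", set()) is False
assert may_use_likeness(CameoAccess.APPROVED, "bob", "alice", {"bob"}) is True
```

Reviewing permissions then amounts to periodically auditing the chosen level and the allow-list against who actually should have access.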

Operational Takeaways for Companies

  • Internal policies: If an organization is considering Sora (e.g., for prototyping media), prepare guidelines on copyright, likeness rights, and biometric processing.
  • Content labeling: Consider requiring AI-generated labeling for external channels.
  • Risk monitoring: Track updates to moderation rules and regional availability, which may affect compliance and publishing workflows. (OpenAI periodically communicates changes to Sora’s safeguards and features.)
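The labeling takeaway above can be enforced mechanically as a pre-publish gate in a content pipeline. The following is a minimal sketch under stated assumptions: the `Post` structure, the label text, and the `ready_to_publish` check are all hypothetical, not part of any OpenAI or platform API:

```python
# Minimal sketch of a pre-publish check that enforces an
# "AI-generated" disclosure label on synthetic media.
# The Post dataclass and label text are illustrative assumptions.
from dataclasses import dataclass, field

AI_LABEL = "AI-generated"  # assumed house label; adapt to channel requirements


@dataclass
class Post:
    caption: str
    is_synthetic: bool                         # e.g., produced with Sora
    labels: list = field(default_factory=list)


def ready_to_publish(post: Post) -> bool:
    """Hold back synthetic media that lacks the required disclosure label."""
    if post.is_synthetic and AI_LABEL not in post.labels:
        return False
    return True


# Usage: a generated clip without the label is blocked until labeled.
clip = Post(caption="Demo reel", is_synthetic=True)
assert not ready_to_publish(clip)
clip.labels.append(AI_LABEL)
assert ready_to_publish(clip)
```

A gate like this keeps labeling a hard requirement of the workflow rather than a manual step that can be forgotten, which also simplifies audits when moderation rules or regional requirements change.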
