The App Store was created to be a safe and trusted place for users to get apps, and a great business opportunity for developers. Apple platforms and the apps you build have become important to many families, as kids use our services to explore the digital world and communicate with family and friends. We hold apps for kids and those with user-generated content and interactions to the highest standards. To continue delivering safe experiences for families together, we wanted to remind you about the tools, resources, and requirements that are in place to help keep users safe in your app.
Made for Kids
If you have an app that's intended for kids, we encourage you to use the Kids category, which is designed for families to discover age-appropriate content and apps that meet higher standards to protect children's data and provide added safeguards for purchases and permissions (e.g., for Camera, Location, etc.).
Learn more about building apps for Kids.
Parental controls
Your app’s age rating is integrated into our operating systems and works with parental control features, like Screen Time. Additionally, with Ask to Buy, when kids want to buy or download a new app or in-app purchase, they send a request to the family organizer. You can also use the Managed Settings framework to make sure the content in your app is appropriate for any content restrictions that may have been set by a parent. The Screen Time API is a powerful tool for parental control and productivity apps to help parents manage how kids use their devices. Learn more about the tools we provide to help parents know, and feel good about, what kids are doing on their devices.
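To illustrate how a parental control app might adopt these APIs, here is a minimal sketch using the FamilyControls and ManagedSettings frameworks. The RestrictionsManager type and the specific restriction applied are illustrative assumptions, not requirements; a real app needs the Family Controls entitlement and iOS 16 or later.

```swift
import FamilyControls
import ManagedSettings

// Illustrative sketch: request Family Controls authorization, then apply
// a Managed Settings restriction on behalf of a parent.
@MainActor
final class RestrictionsManager {
    private let store = ManagedSettingsStore()

    // Ask for permission to manage settings for a child account.
    func authorize() async {
        do {
            try await AuthorizationCenter.shared.requestAuthorization(for: .child)
        } catch {
            print("Family Controls authorization failed: \(error)")
        }
    }

    // Example restriction (an assumption for this sketch): block explicit media.
    func applyContentRestrictions() {
        store.media.denyExplicitContent = true
    }

    // Remove any restrictions this app has applied.
    func clearRestrictions() {
        store.clearAllSettings()
    }
}
```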
Sensitive and inappropriate content
Apps with user-generated content and interactions must include a set of safeguards to protect users, including a method for filtering objectionable material from being posted to the app, a mechanism to report offensive content and support timely responses to concerns, and the ability to block abusive users. Apps containing ads must include a way for users to report inappropriate and age-inappropriate ads.
iOS 17, iPadOS 17, macOS Sonoma, and watchOS 10 introduce the ability to detect and alert users to nudity in images and videos before displaying them onscreen. The Sensitive Content Analysis framework uses on-device technology to detect sensitive content in your app. Tailor your app experience to handle detected sensitive content appropriately for users who have Communication Safety or Sensitive Content Warning enabled.
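As a rough example of adopting the framework, the sketch below checks an image before displaying it. The shouldObscureImage function name and the fallback behavior are assumptions for illustration; adopting the framework also requires the Sensitive Content Analysis entitlement and iOS 17 or later.

```swift
import SensitiveContentAnalysis

// Illustrative sketch: decide whether an incoming image should be obscured
// before it is shown onscreen.
func shouldObscureImage(at url: URL) async -> Bool {
    let analyzer = SCSensitivityAnalyzer()

    // Analysis is only available when the user has Communication Safety or
    // Sensitive Content Warning enabled; otherwise skip the check.
    guard analyzer.analysisPolicy != .disabled else { return false }

    do {
        let analysis = try await analyzer.analyzeImage(at: url)
        return analysis.isSensitive
    } catch {
        print("Sensitive content analysis failed: \(error)")
        return false
    }
}
```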
Supporting users
Users have several ways to report issues with an app, like Report a Problem. Users can also communicate app feedback to other users and developers by writing reviews of their own, and can Report a Concern with other individual user reviews. You should closely monitor your user reviews to improve the safety of your app, and you have the ability to address concerns directly. Additionally, if you believe another app presents a trust or safety concern, or is in violation of our guidelines, you can share details with Apple to investigate.
These user review tools are essential to informing the work we do to keep the App Store safe. Apple deploys a combination of machine learning, automation, and human review to monitor concerns related to abuse submitted via user reviews and Report a Problem. We monitor for topics of concern such as reports of fraud and scams, copycat violations, inappropriate content and advertising, privacy and safety concerns, objectionable content, and child exploitation; and we use techniques such as semi-supervised Correlation Explanation (CorEx) models and Bidirectional Encoder Representations from Transformers (BERT)-based large language models specifically trained to recognize these topics. Flagged topics are then surfaced to our App Review team, who investigate the app further and take action if violations of our guidelines are found.
We believe we have a shared mission with you as developers to create a safe and trusted experience for families, and we look forward to continuing that important work. Here are some resources that you may find helpful:
Sensitive Content Analysis framework
Learn about Ratings, Reviews, and Responses
Report a Trust & Safety concern related to another app
Learn about the ScreenTime framework
Learn about building apps for Kids