Minecraft Java's Controversial Feature: 1.19.1
With the recent release of the Minecraft 1.19 Wild Update, the community is in an uproar because many anticipated features, like the fireflies and the birch forest revamp, were cut. The uproar is slowly subsiding as the days go by, but memories of those features will remain in the hearts of many. Minor updates and fixes have been developed since then, and there's already a 1.19.1 Release Candidate 1, but alongside those versions comes a feature that could forever change the entire Minecraft community.
The feature, an in-game reporting system, was first introduced in Minecraft: Java Edition 1.19.1 Pre-Release 1 on the 21st of June, 2022, and at first glance it looked like a good one. It was then carried over to 1.19.1 Release Candidate 1 two days later and is now available on the Minecraft Launcher. What makes this feature seem good is its power to ban bad players who break Minecraft's End User License Agreement (EULA), fostering a more family-friendly environment for everyone to play in.
This feature, however, has already existed in the Bedrock Edition of the game for over two years and received some backlash from the community there. Mojang and Microsoft decided to implement it on Java anyway.
Why the In-Game Reporting System?
According to the Entertainment Software Rating Board (ESRB), Minecraft is rated for ages 10 and up. Other raters, such as PEGI and USK, rate Minecraft for ages 7+ and 6+ respectively. This means Minecraft has very young players joining servers and interacting with teenagers and adults alike. These interactions can lead young players to pick up inappropriate behavior from those older players, and they can even get scammed or bullied by them. There are even servers with an outright toxic base, such as anarchy servers. Minecraft, however, wants its game to be safe and welcoming to players from all walks of life, and to achieve that, Mojang has implemented this feature to help catch players who dare to break those standards and the License Agreement.
How to Report Players
The reporting feature can be accessed via the Social Interactions screen, found in the Pause Menu or opened by pressing P, its default keybind.
There, you can select multiple messages to report along with the surrounding context of the message.
After selecting the messages, the reporter then selects a category for the report. The categories are listed below:
Imminent Harm - Self-harm or Suicide
-Someone is threatening to harm themselves in real life or talking about harming themselves in real life.
Child Sexual Exploitation or Abuse
-Someone is talking about or otherwise promoting indecent behavior involving children.
Terrorism or Violent Extremism
-Someone is talking about, promoting, or threatening with acts of terrorism or violent extremism for political, religious, ideological, or other reasons.
Hate Speech
-Someone is attacking you or another player based on characteristics of their identity, like religion, race, or sexuality.
Imminent Harm - Threat to Harm Others
-Someone is threatening to harm you or someone else in real life.
Non-consensual Intimate Imagery
-Someone is talking about, sharing, or otherwise promoting private and intimate images.
Harassment or Bullying
-Someone is shaming, attacking, or bullying you or someone else. This includes when someone is repeatedly trying to contact you or someone else without consent or posting private personal information about you or someone else without consent (“doxing”).
Defamation, Impersonation, and False Information
-Someone is damaging someone else's reputation, pretending to be someone they're not, or sharing false information with the aim to exploit or mislead others.
Drugs or Alcohol
-Someone is encouraging others to partake in illegal drug-related activities or encouraging underage drinking.
Once a category is selected, additional comments can be included to provide more details about the report. Evidence of the reported messages' authenticity, provided by the game's secure chat signing, is also attached to further validate the report.
What Happens Next?
According to an article on Mojang's website, their team reviews each report, checking not only the reported messages but also the context surrounding them. This gives the team a thorough, more detailed picture of what happened and why, allowing them to react appropriately. Depending on the severity, the reported account can be suspended temporarily or, in extreme cases, permanently banned.
It could be the very same moderation team Mojang uses on Bedrock Edition, or a new set of moderators trained by the Bedrock ones. Either way, there will be actual human input during the review process, which should keep false bans to a minimum, if not eliminate them entirely.
What happens if there is a false ban?
Being falsely banned for something you didn't do sucks, so Mojang has already added a new page on their website for appealing a ban. By visiting and filling out the form on that page, you can request a Case Review of your ban. A support ticket will be generated, where you can talk with their staff and appeal to have the ban lifted.
However, a case review can only be submitted if your account has an active enforcement lasting longer than 24 hours that was issued within the last 12 months. This is spelled out at the bottom of their Community Standards page. The Community Standards are the general rules that apply when playing the game and using Mojang's online services, such as featured servers on Bedrock and Realms.
What if the Report System is Abused?
In the same article, Mojang says that players are responsible for the reports they submit. Intentionally submitting incorrect reports, excessively sending irrelevant reports, or otherwise abusing the system can therefore lead to action being taken against the reporter.
This could be a good deterrent against those looking to abuse the system and get other players banned for personal gain, and it may scare off would-be abusers from the get-go.
It seems the community is split over the new feature. Some say it is genuinely good and can now stop predators and scammers from causing further damage. Others say it will be the downfall of the game, thanks to possible incorrect enforcement actions that could destroy server communities.
The anarchy community in particular has given the most negative feedback, as this could genuinely destroy their scene. One example is 2 Builders 2 Tools, popularly known as 2b2t, one of the oldest anarchy servers in Minecraft with over a decade of anarchical history. Players there have no particular rules except not to backdoor the server or lag it into oblivion. Players can grief and destroy builds, use hacked clients, use foul language, and do much more that could be subject to a long temporary ban, or even a permanent one, if reported. With the addition of the reporting feature, many of them could truly be banned forever, resulting in fewer players on anarchy servers and, thus, the collapse of their community. Of course, they can remain on older versions, but this isn't a viable option, as these servers need to update to keep a healthy player population.
Opinion on this matter…
Personally, I have played Minecraft since 2013 and have had fun with the game. Sometimes I got killed over and over, and I even had spats with other players over Factions territory. Those experiences were all part of the game, and I don't hate them; they're what let me gain new friends and even experience the real world in a way. However, there are some bad experiences I may never have had, which can make me biased on this topic.
For me, I would rather not have this feature, as it can be used to take the fun out of servers. The chatting and trash talking pumps you up, and it can still be moderated by admins and moderators. As someone who has hosted a server before, I know there are plugins, such as EssentialsX, that make moderating servers easier thanks to features like chat logging and even listening in on private messages and in-game mail. It may be imperfect, but this limits the damage of a ban to a single server rather than the entire multiplayer feature (where, if banned, you can't even join LAN games anymore).
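The per-server moderation described above can be illustrated with a small sketch. To be clear, this is a hypothetical example written for this article, not the actual EssentialsX API; all class and method names here are made up, but the idea (admin-maintained mute lists and word filters applied before chat is broadcast, with consequences confined to one server) is the same.

```java
import java.util.List;
import java.util.Set;
import java.util.regex.Pattern;

// Hypothetical sketch of per-server chat moderation, the kind of control
// plugins like EssentialsX give server admins. A mute or filter here only
// affects this one server, not the player's whole account.
public class ChatModerator {
    private final Set<String> mutedPlayers;   // players silenced by admins
    private final List<String> blockedWords;  // words censored from chat

    public ChatModerator(Set<String> mutedPlayers, List<String> blockedWords) {
        this.mutedPlayers = mutedPlayers;
        this.blockedWords = blockedWords;
    }

    // Returns the chat line to broadcast, or null if it should be dropped.
    public String filter(String player, String message) {
        if (mutedPlayers.contains(player)) {
            return null; // muted on this server only; the account is untouched
        }
        String cleaned = message;
        for (String word : blockedWords) {
            // Case-insensitive replacement of each blocked word
            cleaned = cleaned.replaceAll("(?i)" + Pattern.quote(word), "***");
        }
        return "<" + player + "> " + cleaned;
    }
}
```

A server would run every incoming chat message through something like `filter` before broadcasting it, and log the original for admins to review, which is roughly what the chat-logging features mentioned above do.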
Furthermore, in FitMC's video, he noted that there are even platforms where, if you're banned, you can't access your single-player worlds anymore, rendering the game completely unplayable. YouTubers may get reported while making content, either by haters or by random players who think they're an impersonator. This is especially a risk with cracked servers, where players without an account can join through a cracked launcher using a famous YouTuber's username, which can then affect the person who actually owns the account.
This reporting feature will only encourage some players to gatekeep the game, weaponizing the system to keep other players, especially kids, from joining. It may even lead to players leaving the game entirely after being banned over false reports or fabricated chats from these bad actors.
The report system is a good concept and may truly stop bad actors from exploiting others and making the game unwelcoming, but it can also lead to many accounts being banned, including an exodus of veteran players. And since chat is primarily used on servers, the feature feels somewhat unnecessary. The game is already receiving backlash from the recent update, and with this feature, Mojang and Microsoft will surely receive more, the same way they did after implementing it on Bedrock Edition.
Ignoring my opinions, though, what are your thoughts on this new feature? Will it improve the Minecraft community and environment, or will it lead to mass bans and players quitting the game?