Facebook has announced the latest version of its successful standalone virtual reality (VR) headset, the Oculus Quest 2. The new device packs more computing power and a sharper screen than its predecessor, and is also US$100 cheaper.
The Oculus Quest 2 is the latest step in Facebook’s long-term strategy of making VR more accessible and popular. Facebook recently brought all its VR work under the umbrella of Facebook Reality Labs, announced new applications such as the Infinite Office VR workplace, and will require a Facebook login for future Oculus devices.
The compulsory link to Facebook has many consumers concerned, considering the social media giant’s chequered history with privacy and data. VR and its cousin, augmented reality (AR), are perhaps the most data-extractive digital sensors we’re likely to invite into our homes in the next decade.
Why does Facebook make virtual reality headsets?
Facebook acquired VR company Oculus in 2014 for an estimated US$2.3 billion. But where Oculus originally aimed at gamers, Facebook boss Mark Zuckerberg wants VR for social media.
Speaking at the company’s developer conference last year, Zuckerberg said Facebook sees VR as a pathway to a new kind of “social computing platform” using the enhanced feeling of “presence” that VR affords. For Facebook, the introduction of VR-based computing will be like the leap from text-based command line interfaces to the graphical user interfaces we use today.
This may well be right. VR affords a strong feeling of embodied presence that offers new possibilities for entertainment, training, learning and connecting with others at a distance.
But if the VR future is the one Facebook is “working in the lab” on, it will function via the company’s existing social computing platform and business model of extracting data to deliver targeted advertisements.
Virtual reality collects real data
A VR headset collects data about the user, but also about the outside world. This is one of the key ethical issues of emerging “mixed reality” technologies.
As American VR researcher Jeremy Bailenson has written:
…commercial VR systems typically track body movements 90 times per second to display the scene appropriately, and high-end systems record 18 types of movements across the head and hands. Consequently, spending 20 minutes in a VR simulation leaves just under 2 million unique recordings of body language.
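Bailenson’s arithmetic checks out. Here is a quick back-of-the-envelope sketch, assuming each of the 18 tracked movement types is sampled at the full 90 Hz rate:

```python
# Back-of-the-envelope check of Bailenson's figures.
TRACKING_RATE_HZ = 90      # body movements tracked 90 times per second
TRACKED_MOVEMENTS = 18     # movement types across the head and hands
SESSION_MINUTES = 20       # a typical VR session

recordings = TRACKING_RATE_HZ * TRACKED_MOVEMENTS * SESSION_MINUTES * 60
print(f"{recordings:,}")   # 1,944,000 -- "just under 2 million"
```

That is nearly two million data points from a single short session, before counting anything the outward-facing cameras capture.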
The way you move your body can be used to identify you, like a fingerprint, so everything you do in VR could be traced back to your individual identity.
Facebook’s Oculus Quest headsets also use outward-facing cameras to track and map their surroundings.
In late 2019 Facebook said they “don’t collect and store images or 3D maps of your environment on our servers today”. Note the word today, which tech journalist Ben Lang notes makes clear the company is not ruling out anything in the future.
Virtual reality leads to augmented reality
Facebook wants to collect this data to facilitate its plans for augmented reality (AR).
Where VR takes a user to a fully virtual environment, AR combines virtual elements with our real surroundings.
Last year Facebook unveiled the Live Maps application, a vision of an expansive surveillance apparatus presumably powered by AR glasses and data collected from Oculus Insight. Live Maps will provide many minor conveniences for Facebook users, like letting you know you’ve left your keys on the coffee table.
Now Facebook have announced their first steps towards making this a reality: Project Aria. This will involve people wearing glasses-like sensors around Seattle and the San Francisco Bay area, to collect the data to build what Wired co-founder Kevin Kelly calls “the mirrorworld”, the next big tech platform.
People are rightly concerned about the ethical implications of this kind of data extraction. Alongside Project Aria, Facebook launched its Responsible Innovation Principles page, and has been quick to emphasise that faces and licence plates will be blurred in this data collection.
As we have argued elsewhere, framing questions about VR and AR surveillance in terms of individual privacy suits companies like Facebook very well. That’s because their previous failings are actually in the (un)ethical use of data (as in the case of Cambridge Analytica) and their asymmetric platform power.
We need more than just ‘tech ethics’
Groups like the XR Safety Initiative recognise these emerging issues, and are beginning work on standards, guidelines and privacy frameworks to shape VR and AR development.
Many emerging technologies encounter what is known as the Collingridge problem: it is hard to predict the various impacts of a technology until it is extensively developed and widely used, but by then it is almost impossible to control or change.
We see this playing out right now, in efforts to regulate Google and Facebook’s power over news media.
As David Watts argues, big tech designs its own rules of ethics to avoid scrutiny and accountability:
Feelgood, high-level data ethics principles are not fit for the purpose of regulating big tech … The harms linked to big tech can only be addressed by proper regulation.
What might regulation of Facebook’s VR look like? Germany offers one such response – their antitrust regulations have resulted in Facebook withdrawing the headset from sale. We can only hope the technology doesn’t become too entrenched to be changed, or challenged.
But regulation has not always stopped Facebook, which paid US$550 million to settle a lawsuit for breaching biometric privacy laws. In the multi-billion-dollar world of big tech, it’s all a cost of doing business.
Another question we might ask ourselves is whether Facebook’s virtual-reality future and others like it really need to exist. Maybe there are other ways to avoid forgetting your keys.
Digital Banking Goes Passwordless
Quarantine protocols during the COVID-19 pandemic limited in-person activities, driving a rapid increase in the use of digital financial services. With many new users signing up on their platforms, digital banks and financial technology (fintech) firms must ensure that only legitimate clients can use their services. They must also deliver a frictionless, hassle-free experience to meet the changing demands of modern consumers.
Still, many financial institutions continue to rely on conventional identity verification methods, such as passwords, to prevent unauthorized persons from using someone else’s account. This outdated authentication process does not confirm that the person is the real account owner; it only verifies that they know the login credentials.
Once a criminal succeeds in taking over a customer’s account, the result can be significant financial losses, along with other costs such as reputational damage and customer dissatisfaction. If financial institutions want to go truly digital, they must shore up their digital defenses with passwordless login compliant with FIDO2 authentication standards, curtailing fraudulent attacks while ensuring customer satisfaction.
FIDO2’s multi-factor authentication (MFA) uses a combination of login credentials that bind identity to the device, such as inherence factors (biometrics) and possession factors (cryptographic keys stored on a registered device). Some implementations also use knowledge factors, such as a stored pattern swipe. Unlike passwords, these authentication factors are not easily circumvented or stolen.
During account enrollment, customers opt in to passwordless login. They can simply use a camera-enabled mobile device to take a selfie and scan an identity document to complete the facial biometric identity-proofing procedure. They also undergo an active liveness check to confirm their real-time presence. In a few seconds, the customer’s identity is confirmed and cryptographic keys are established, creating a digital chain of trust.
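The login flow that follows enrollment is a challenge-response exchange: the server sends a fresh random challenge, and the registered device answers it using a key that never travels over the network. The sketch below illustrates the flow only; real FIDO2/WebAuthn uses public-key signatures (only the public key is registered with the server), whereas here a shared HMAC key stands in for the device credential to keep the example self-contained:

```python
# Simplified sketch of a FIDO2-style challenge-response login flow.
# NOTE: real FIDO2/WebAuthn uses asymmetric signatures and registers only
# the PUBLIC key with the server; this HMAC version is a stand-in to show
# the flow (challenge, device response, server verification), not the crypto.
import hashlib
import hmac
import secrets

# Enrollment: the device generates a credential. In real FIDO2 the private
# key stays on the device and only a public key is sent to the bank.
device_key = secrets.token_bytes(32)   # held on the registered device
server_registered_key = device_key     # stand-in for the registered public key

def device_sign(key: bytes, challenge: bytes) -> bytes:
    """The device 'signs' the challenge after a local biometric check."""
    return hmac.new(key, challenge, hashlib.sha256).digest()

# Login: the server issues a fresh, single-use random challenge...
challenge = secrets.token_bytes(32)

# ...the device answers it...
response = device_sign(device_key, challenge)

# ...and the server verifies. No password is ever transmitted or stored.
assert hmac.compare_digest(response, device_sign(server_registered_key, challenge))
print("authenticated")
```

Because each challenge is random and single-use, a captured response is useless for replay, which is one reason these factors are so much harder to steal than passwords.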
Going passwordless enables digital financial services providers to keep up with fast-changing industry trends and practices. For more information, see the following infographic from authID.
What Is the Public Cloud and How Does It Work?
Technology is advancing in every field, opening new opportunities for businesses of every size, whether small or large. This article covers the basics of the public cloud: what it is and how it can benefit you.
Overview of the public cloud
- The public cloud is a computing model in which a provider makes resources available to clients over the public internet. These resources vary by provider and may include applications, storage capacity, or virtual machines. The public cloud permits a level of scalability and resource sharing that a single organization could not otherwise attain.
- Some public cloud resources are offered free of charge, while clients pay for others through a subscription or a pay-per-use model. Cloud services are available to individual users, and prices depend on the resources required. Organizations with large amounts of data should develop a cloud migration plan before choosing a cloud vendor.
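The choice between subscription and pay-per-use pricing comes down to workload size and steadiness. A minimal sketch of that trade-off, using entirely hypothetical rates (real providers publish their own price sheets):

```python
# Hypothetical rates for illustration only -- not any real provider's pricing.
PAY_PER_GB_HOUR = 0.0002   # pay-per-use storage rate, USD per GB-hour
FLAT_MONTHLY = 120.0       # flat monthly subscription, USD

def pay_per_use_cost(gb_stored: float, hours: float) -> float:
    """Monthly cost under the usage-metered model."""
    return gb_stored * hours * PAY_PER_GB_HOUR

HOURS_PER_MONTH = 730
small = pay_per_use_cost(100, HOURS_PER_MONTH)      # ~100 GB, light workload
large = pay_per_use_cost(10_000, HOURS_PER_MONTH)   # ~10 TB, heavy workload

# A small, bursty workload favours pay-per-use; a large, steady one
# favours the flat subscription.
print(f"small: ${small:.2f}, large: ${large:.2f}, flat: ${FLAT_MONTHLY:.2f}")
```

This is why migration planning matters: estimating data volume up front determines which billing model (and which vendor) is cheaper.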
Public cloud security:
- Modern providers take security very seriously. Providers such as WehaveServers.com employ specialized security personnel to automate security functions and monitor systems for anomalies. Strict policies protect each client’s data from being accessed by other cloud tenants.
- For an additional level of security, organizations can deploy public cloud solutions within a hybrid environment, keeping sensitive workloads on private infrastructure.
The public cloud is an alternative to traditional on-premises IT architecture for developing and running applications. In the public cloud computing model, a third-party provider hosts scalable, on-demand IT resources and delivers them to clients, supplying the infrastructure clients need to host or deploy their workloads in the cloud.