They traveled for hours on end for a cable car ride in a place where such a thing does not exist

A couple excitedly traveled a few hours to ride a cable car to the top of a mountain. The idea had appealed to them after they watched a video, narrated by a Malaysian TV journalist, showing smiling tourists thrilled with the ride. Unfortunately, it turned out they had been deceived.

Upon arrival, the couple found themselves in an unremarkable town whose locals had no idea what cable car they were talking about. The video had been generated by artificial intelligence, and the two had believed it was real.

The incident, detailed in an article published by Fast Company, may seem like a one-off, but it is something everyone will need to keep in mind when searching the internet for things to buy or places to visit, notes TechRadar.

A small logo in the corner of the video indicates that the material was created with Veo 3, Google's latest AI video-generation model. And it is not the only giveaway: the people and structures have that telltale unreal sheen of artificial intelligence. Still, if you have no experience with deepfakes and aren't looking for those specific signs, you might never notice, and it may even seem absurd to be suspicious of a well-made tourist video.

The video circulated under titles such as "Apakah benar Kabel car di Pengkalan Hulu" ("Is the cable car at Pengkalan Hulu real?") and "Unbelievable cable car at Pengkalan Hulu Perak".

However, the new reality we will have to get used to is that artificial intelligence can now sell us not just a product but a destination, and that destination may never have existed. The usual red flags of deceptive material, such as slightly misspelled words and suspicious URLs, are nothing compared with this tourist clip. It wasn't even clear whether it was malicious or simply someone's misguided attempt at creating content.

It's easy to roll your eyes and say it will never happen to you. But we all have some weaknesses, and artificial intelligence is becoming increasingly adept at targeting them.

AI video generation has advanced so far that we can hardly tell real footage from fake. We therefore need to pay special attention to the signs of AI-generated content.

This doesn't mean we should abandon all travel plans, but the average person now needs a new kind of literacy, calibrated not only to detect Nigerian-prince scams and cryptocurrency traps but also the illusions that video creators and travel influencers conjure with AI.

Furthermore, we must also watch out for false presentations of real places, with review sections flooded by AI-generated fake testimonials about things that don't exist, the publication adds.

It's a laborious process that means being suspicious of anything that seems too good to be true. You may need to check multiple sources to confirm that everyone agrees something is real, run a reverse image search, or look it up on social networks. And with images and videos, check whether they are suspiciously flawless: if no one in a crowded photo is frowning, coughing, or sneezing, be cautious, because it is all a little too perfect.

It's regrettable that we now doubt a beautiful place when we see it, instead of immediately making plans to go there. But perhaps this is the price of living in a world where anyone can create illusions about anything. You will simply have to put in more effort to make sure you are heading to a real place, not an illusion built from pixels and algorithms, the publication concludes.

T.D.
