By Mitra Sorrells – BizBash
From AR to VR and N.F.C. to R.F.I.D., here is the tech terminology that’s shaping the future of conversations around meetings, trade shows, conferences, fund-raisers, and more.
The glossary of must-know terms for planners has been growing in recent years. The invention of new apps, software, hardware, and technology products—including many that seem like something from a sci-fi movie and were unimaginable just a few years ago—is not only disrupting event design and the attendee experience, it’s also creating a new lexicon for the profession.
Here’s our list of 12 essential tech terms for planners. Chances are you have already heard, and experienced, many of these. And if you haven’t, we bet you will in the near future.
Chatbots are interactive communication tools that use artificial intelligence to automate the process of responding to common questions from guests. Rather than assigning a staff person to monitor and answer questions such as “What time is dinner?” or “Where is the networking event?” a chatbot can receive and answer these questions via text message, Facebook Messenger, or an event app.
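For planners curious what sits under the hood, the core idea is simple: match an incoming question to a canned answer. The sketch below is a hypothetical, keyword-based version in Python; commercial event chatbots use far more sophisticated natural-language models and plug into channels such as SMS or Facebook Messenger.

```python
# Minimal sketch of the chatbot idea: match an attendee's question to a canned
# answer by keyword. The questions and answers here are hypothetical examples.

FAQ = {
    "dinner": "Dinner begins at 7 p.m. in the Grand Ballroom.",
    "networking": "The networking event is at 5:30 p.m. on the rooftop terrace.",
    "wifi": "Join the 'EventGuest' network; the password is on your badge.",
}

def answer(question: str) -> str:
    text = question.lower()
    for keyword, reply in FAQ.items():
        if keyword in text:
            return reply
    return "Sorry, I don't know that one yet; a staff member will follow up."

if __name__ == "__main__":
    print(answer("What time is dinner?"))
    print(answer("Where is the networking event?"))
```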
Virtual Reality puts the user in a computer-generated environment. Participants don a headset or look through a handheld viewer to experience the virtual world, which may include images, sounds, and sensations to create the feeling of being inside that virtual space.
Augmented Reality superimposes graphics, sounds, videos, and more onto the user’s view of reality. Unlike VR, which is completely immersive, AR simply enhances, or augments, a real setting.
R.F.I.D. and N.F.C. are related, but not identical, wireless communication systems that use radio waves to transmit information between tags and readers. N.F.C. is a type of R.F.I.D. used for close-range communication, such as tapping a bracelet to an exhibitor’s display to receive product information or tapping badges with a fellow attendee to exchange contact information. Traditional R.F.I.D. is effective at longer ranges, so it can be used for purposes such as attendee tracking and access control without requiring attendees to take any action.
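As a rough illustration of the access-control use, the sketch below (in Python, with made-up tag IDs) shows the kind of lookup a reader system performs when a badge’s R.F.I.D. tag is detected: the tag either matches a registered attendee or it doesn’t, and every read is logged, which is also how passive attendee tracking works.

```python
from datetime import datetime

# Hypothetical mapping of R.F.I.D. tag IDs to registered attendees.
REGISTERED = {
    "04A1B2C3": "Jordan Lee (VIP)",
    "04D4E5F6": "Sam Rivera (General)",
}

access_log = []  # (timestamp, entrance, tag_id, granted); doubles as tracking data

def handle_tag_read(tag_id: str, entrance: str) -> bool:
    """Called whenever a reader at an entrance detects a tag; no attendee action needed."""
    granted = tag_id in REGISTERED
    access_log.append((datetime.now(), entrance, tag_id, granted))
    return granted

print(handle_tag_read("04A1B2C3", "Main Hall"))   # True: registered badge, open the gate
print(handle_tag_read("FFFFFFFF", "Main Hall"))   # False: unknown tag, alert staff
```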
Geofence refers to a virtual boundary that is created around a real-world location. The system uses GPS or R.F.I.D. to identify when mobile devices are within that boundary and to trigger communication to, or monitoring of, those devices. Planners can use it to create hyperlocal experiences, such as sending a coupon code to attendees when they are near concession stands, and to track engagement within a defined area.
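At its core, a GPS geofence is just a distance check: is the device’s reported position within some radius of a point? A minimal sketch, with made-up coordinates for a hypothetical concession stand:

```python
# Sketch of the core geofence check: is a device's GPS position inside a circular
# boundary around a venue point? Coordinates and radius below are made-up examples.
from math import radians, sin, cos, asin, sqrt

def distance_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points in meters (haversine formula)."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 6_371_000 * 2 * asin(sqrt(a))

def inside_geofence(device_lat, device_lon, fence_lat, fence_lon, radius_m):
    return distance_m(device_lat, device_lon, fence_lat, fence_lon) <= radius_m

# Example: a 50-meter fence around a concession stand
if inside_geofence(40.7415, -73.9897, 40.7413, -73.9895, 50):
    print("Send the coupon push notification.")
```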
Biometric Data can be gathered via a wearable device, such as a wristband, that measures a person’s movement, skin temperature, heart rate, and more.
Beacons are small wireless devices that transmit information that can be received by smartphones, tablets, lead retrieval devices, and more. At events, beacons can be used to automate the check-in process, to track attendee movement and dwell time, to share exhibitor or sponsor information with guests, and to assist with wayfinding.
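One way to picture the dwell-time use: the event app logs which beacon it heard and when, and simple arithmetic turns that log into time spent near each booth. A toy sketch with hypothetical beacon IDs and timestamps:

```python
from collections import defaultdict

# Hypothetical log of (seconds_since_doors_opened, beacon_id) sightings from one attendee's app.
sightings = [(0, "booth-A"), (60, "booth-A"), (120, "booth-B"), (300, "booth-B"), (360, "booth-A")]

dwell = defaultdict(int)
for (t0, beacon), (t1, _) in zip(sightings, sightings[1:]):
    dwell[beacon] += t1 - t0   # credit the interval to the beacon heard at its start

print(dict(dwell))  # {'booth-A': 120, 'booth-B': 240} seconds of dwell time
```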
Retargeting is a strategy to reach people who have visited your website but not taken the action you desire, such as registering for your event or buying your product. Planners can also offer retargeting services as a benefit for sponsors, so registered attendees see ads from the event’s sponsors in the weeks prior to the event.
Ultrasonic Beacons transmit tones that can be picked up by a smartphone’s microphone. The tones are inaudible to the human ear and can be used in a variety of ways at events, including ticketing, wayfinding, scheduling, and more.
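The underlying trick is to encode data in tones above the range of human hearing (roughly 18 to 20 kHz), which a phone’s microphone can still pick up. A rough sketch of the encoding side, using NumPy and a simple two-frequency scheme (one hypothetical tone for a 0 bit, another for a 1); real systems are considerably more robust:

```python
import numpy as np

SAMPLE_RATE = 44_100               # samples per second, a standard audio rate
FREQ_0, FREQ_1 = 18_500, 19_500    # near-ultrasonic tones for bits 0 and 1 (hypothetical choice)
BIT_DURATION = 0.1                 # seconds per bit

def encode_bits(bits):
    """Return an audio waveform with one inaudible tone per bit."""
    t = np.linspace(0, BIT_DURATION, int(SAMPLE_RATE * BIT_DURATION), endpoint=False)
    tones = [np.sin(2 * np.pi * (FREQ_1 if b else FREQ_0) * t) for b in bits]
    return np.concatenate(tones)

waveform = encode_bits([1, 0, 1, 1])   # e.g., part of a ticket or room identifier
print(waveform.shape)                  # (17640,): ready to play through a speaker
```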
Big Data refers to the large volume and complexity of data that is generated by every action we take online, and in some cases even when we’re offline. When people visit websites, communicate on social media, carry GPS-equipped smartphones, check in to venues, etc., those actions create “digital footprints” that can be analyzed to inform future decisions and, in the case of events, to create better attendee experiences.
Emotion Analysis uses facial recognition software and a webcam to analyze the expressions on a person’s face. Drawing on intelligence culled from thousands of existing pictures of faces, the algorithm interprets parts of the user’s face, such as the corners of the mouth or the position of the brow, and links that information to emotions.
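To make that last step concrete, here is a deliberately over-simplified sketch: assume a face-landmark detector has already produced two hypothetical measurements (how much the mouth corners are raised and how far the brow is lowered, each on a 0-to-1 scale), and a rule links them to an emotion label. Production systems use models trained on thousands of labeled faces rather than hand-written rules.

```python
def label_emotion(mouth_corner_raise: float, brow_lower: float) -> str:
    """Toy rule linking two facial measurements to an emotion label (illustrative only)."""
    if mouth_corner_raise > 0.6:
        return "happy"
    if brow_lower > 0.6:
        return "angry"
    return "neutral"

print(label_emotion(0.8, 0.1))  # happy
print(label_emotion(0.2, 0.7))  # angry
```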