SemanticMarker 4+

IoT Optical Vision Marker App

Scott Moody

Designed for iPad

    • $9.99

Description

Kona Currents, LLC offers a unique Remote Education and IoT processing app for the creation, collection, processing, search, and messaging of optical visual markers we call the Semantic Marker(R). An innovative text and optical search engine is included, along with an API for processing optical labels, barcodes, and two-dimensional codes. These optical image and data files are downloadable for internal and external use. The app interacts with cloud computing services through published APIs. It communicates via MQTT messaging among a suite of IoT devices, and also directly with devices over Bluetooth Low Energy (BLE), which is especially useful for setting credentials such as Wi-Fi passwords on IoT devices (e.g., ESP32 devices). See SemanticMarker.org for more on these capabilities. We have one user using Eye Tracking to send messages to reward his service dog over our IoT framework! #Ability

New for Summer 2025 is support for Space and Time in your Semantic Marker(R). This includes GPS coordinates and time extracted from the metadata of your markers. The Map now shows markers in their geospatial locations. See the map above, where the optical visual markers, which represent music artifacts, are placed where the concerts were recorded. The dates of the events are also tagged and stored in the photo for sharing outside the Semantic Marker(R) framework. Now think of real estate or other location-based artifacts: tag one with a Semantic Marker(R) and the marker becomes much more valuable.

What’s New

Version 7.5

An exciting new Semantic Marker(R) now truly supports Space and Time: your own SMART Button can be tagged with geospatial (GPS) values so it shows up on map products at the desired location. You can also enter time values in the past, to reflect the SMART Memory, or in the future, which will serve as a future triggering mechanism. What's especially powerful is that a Semantic Marker(R) can be sent through the air, perhaps in a paper airplane across the late Kingdome (with no electronic tracking of that message), then re-digitized on the receiver's end, and they will get the same location and date: Space and Time.

So a high school reunion event, or photos from 50 years ago, can be tagged with the correct location and time. The app can then show a map, and photos saved to a user's smartphone photo library will also carry those values in their GPS and EXIF photo fields. Thus, even outside my app, you get mapping and time features, since exported SMART Buttons include those values. Truly portable (outside my app).
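To sketch why these values survive outside the app: EXIF stores GPS coordinates not as signed decimals but as unsigned degree/minute/second rationals plus a hemisphere reference. A minimal Python sketch of that conversion is below; the coordinate is an illustrative example near the Kingdome site, not data from the app itself.

```python
from fractions import Fraction

def to_exif_gps(decimal_deg, is_latitude):
    """Convert a signed decimal degree to the EXIF GPS form:
    ((degrees, minutes, seconds-as-rational), hemisphere reference)."""
    if is_latitude:
        ref = "N" if decimal_deg >= 0 else "S"
    else:
        ref = "E" if decimal_deg >= 0 else "W"
    value = abs(decimal_deg)
    degrees = int(value)
    minutes = int((value - degrees) * 60)
    # EXIF rationals are integer pairs; Fraction approximates the remainder
    seconds = Fraction((value - degrees) * 3600 - minutes * 60).limit_denominator(10000)
    return (degrees, minutes, seconds), ref

# Illustrative coordinate only (roughly the former Kingdome in Seattle)
print(to_exif_gps(47.5952, True))
print(to_exif_gps(-122.3316, False))
```

Any EXIF-aware tool (a photo library, a map product) can read these fields back, which is what makes an exported SMART Button portable.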

Triggers are also better incorporated. The app now supports Location Triggers: the GPS values mentioned above can fire a trigger when a user is within a set distance of that Semantic Marker(R). This works without Bluetooth tags, using the phone's GPS instead. Other triggers are based on JSON values set over MQTT; for example, a trigger might fire when the feedCount is greater than 100. Currently, a DOCFOLLOW message is sent.
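Both trigger styles can be sketched in a few lines of Python. This is a hypothetical illustration, assuming a haversine distance check for the location trigger and a JSON payload carrying feedCount; the radius, field names, and wire format are assumptions, not the app's actual protocol.

```python
import json
from math import radians, sin, cos, asin, sqrt

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two (lat, lon) points, in meters."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + \
        cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6_371_000 * asin(sqrt(a))

def location_trigger(phone, marker, radius_m=100.0):
    """Fire when the phone's GPS is within radius_m of the marker's GPS tag."""
    return haversine_m(*phone, *marker) <= radius_m

def mqtt_trigger(payload, field="feedCount", threshold=100):
    """Fire when a JSON value set over MQTT exceeds a threshold."""
    return json.loads(payload).get(field, 0) > threshold

# A phone about 45 m from a marker fires; a feedCount of 101 fires
print(location_trigger((47.5956, -122.3316), (47.5952, -122.3316)))
print(mqtt_trigger('{"feedCount": 101}'))
```

The design point is that the location trigger needs only the phone's own GPS fix and the marker's stored coordinates, so no Bluetooth tag or extra hardware is involved.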
An example Apple map shows Grateful Dead albums mapped to their concert locations, with a timestamp for each event. Again, this works outside my tool once my app exports the markers as optical or visual markers. https://photos.smugmug.com/Groups/KnowledgeShark/SemanticMarkerUserManual/i-nJ7q6SN/0/LxzwwNJr5RtSCHpQJBWdMWG6jB6Bn3474JFvBQTvn/X2/IMG_4707-X2.png

Some tools and web sites strip the location and time. An example is the following, which will take the user to a SMART Music site for a 1973 Paris Grateful Dead concert:
https://semanticmarker.org/music/May2_72.PNG

TRON, the SMART Memory Stamp, is also powerful. One can now wrap TRON in a Semantic Marker(R), so its values are stored (and can change).

I really like the SMART Music aspect of the Semantic Marker(R) app. Here you can point to music artifacts on your phone and create a SMART Button with the image of that music and links to the requested music.
What's even more interesting is that the archive.org open-source streaming site can now be referenced, and my code will extract the playlist, including the song names and a link to the mp3 of each song. This is placed in a SMART Button in the artifact field with @playlist and @artist fields.
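For a sense of how such an extraction can work: archive.org publishes a JSON metadata document per item (at https://archive.org/metadata/<identifier>) listing the item's files, and a playlist can be assembled by filtering for MP3 entries. The sketch below uses a hard-coded sample response; the identifier, file names, and field mapping are illustrative assumptions, not the app's actual code.

```python
import json

SERVER_URL = "https://archive.org"

def extract_playlist(metadata, identifier):
    """Build (song title, mp3 URL) pairs from an archive.org-style
    metadata response by keeping only the MP3-format files."""
    tracks = []
    for f in metadata.get("files", []):
        if "MP3" in f.get("format", ""):
            title = f.get("title") or f["name"]
            tracks.append((title, f"{SERVER_URL}/download/{identifier}/{f['name']}"))
    return tracks

# Hypothetical sample of the JSON such an endpoint returns
sample = json.loads("""{
  "files": [
    {"name": "t01.mp3", "format": "VBR MP3", "title": "Opening Jam"},
    {"name": "t01.flac", "format": "Flac"}
  ]
}""")
print(extract_playlist(sample, "example-show-1972"))
```

The resulting (title, URL) pairs map naturally onto the @playlist and @artist fields stored in the SMART Button's artifact field.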

As mentioned above, a map can show where a Semantic Marker(R) selection exists in Space and Time. Photo images stored on your phone can also be used in a similar manner (mapping where they were taken). Think of a realtor with the locations of all their houses. The Semantic Marker(R) will carry that GPS information, as well as the time, and any artifacts within the SMART Button itself (the web page for the artifact). Note that the Semantic Marker(R) offers a 'markup' field that renders a web page for the user, without requiring their own web page storage. The potential for getting use counts from random potential clients (say, at the Puyallup Fair) will also be introduced soon.

I've collected some videos that loosely form the user manual:
https://www.smugmug.com/app/organize/Groups/KnowledgeShark/SemanticMarkerUserManual

Let me know if you want more information on a new way to process our information and our memories. We coined "Grandma's Attic" as one application: it is where the physical and virtual worlds overlap.

App Privacy

The developer, Scott Moody, indicated that the app’s privacy practices may include handling of data as described below. For more information, see the developer’s privacy policy.

Data Not Collected

The developer does not collect any data from this app.

Privacy practices may vary, for example, based on the features you use or your age. Learn More

Supports

  • Family Sharing

    Up to six family members can use this app with Family Sharing enabled.

You Might Also Like

Siren Connected Boat
Productivity
Merge PDF - Combine PDF
Productivity
OneCalc: All-in-one Calculator
Productivity
Quicken Classic
Productivity
Quantum Fiber
Productivity
BlueParrott
Productivity