Kona Currents, LLC Privacy Policy for
Semantic Marker®
Innovative Optical Visual Marker Processing
We Collect No Personal Information Using Our Applications
-
We do not collect, use, save, or have access to any of your personal data recorded by SemanticMarker for iOS, Mac or Apple TV.
-
Individual settings relating to the SemanticMarker apps are not personal and are stored only on your device. You might also be asked to provide access to your photo library, but this is only so you can open your photos in SemanticMarker and save them back to your library. We don't process that information at all and have no access to it.
Data Retention Policy
-
SemanticMarker for iOS, Mac or Apple TV does not retain any personal data.
Location Data Use
-
Location data requested by SemanticMarker for iOS is used to find the WIFI networks available for your use. The WIFI lookup happens only briefly (the blue arrow in the upper left of the iPhone screen shows its short use). The Look button invokes this feature.
-
Location data (if allowed by the user) is stored by the camera app with the photo (just like the default iOS camera app).
We have no access to that data, and it is not shared with anyone unless you choose to share the image with location sharing enabled.
-
Location data is NOT used by the Map window on iOS.
Bluetooth BLE Interface
-
SemanticMarker for iOS interfaces with local Internet of Things (IoT) devices using the Bluetooth BLE interface. This interface is used to interact with the devices, sending login credentials and various other messages. No personal data is used.
- The main BLE actions include:
- Show the discovery of the ESP-32 devices (subset of all BLE devices)
- Show binding to one of them
- Show "feed" button and the device turning (feeding the dog)
- Show the WIFI credentials that are sent to the device
- Optionally show the Semantic Marker being scanned and feeding the dog
Audio Use
-
The SemanticMarker for iOS uses audio services in a few ways:
- Playing Video and Audio
- Video and Audio artifacts stored on the user's iOS device are played when selected. They also
play in the background without the iOS app being open. The lock screen shows the artifact, just
like the Music app.
- Video and Audio artifacts from external web addresses can also play in a similar manner. The
user cannot manually enter a web address, just as the embedded web browser does not allow web addresses to be entered
(thus supporting the 4+ age rating).
- Camera Apps use of audio
- The Camera app uses audio in live and video capture modes. That audio is stored with the photo artifact in the user's photo album (if photo library access is granted).
- Speech Recognition (using microphone)
- Speech Recognition is available while using the Camera feature. The upper-right button shows a microphone. Tapping it
uses the microphone of the iOS device to listen for known spoken words. Currently only "Feed the Dog" is supported;
recognizing it calls the appropriate code to perform that action and shows a status message that the command completed.
Speech recognition currently handles one command at a time: once it performs the recognition, it leaves listening mode,
and the user must tap the button again to re-enter recognition mode.
- Speech (over iOS audio) is briefly used
- Speech utterance is briefly used to explore interacting with users via speech. Currently this is used
in a single location: when the user selects "Speak To Me", a single phrase is output through the user's speaker.
Dynamic or user-created phrases are not supported.
- Picking songs and getting their addresses (Apple Music Usage)
- This mode lets the user select from their own media items (songs stored on their phone).
The result is a set of IDs that point to songs on their phone. Later, that set of songs can be
added to the user's playlist. Playing the music is done via the user's Music app (not this app).
If the user shared this list, it would not be usable elsewhere, because the IDs, including any
metadata about the song, are unique to the user's individual iOS device. So no private or personal
information (such as personal music choices) is sharable.
Third Party Services/SDKs over WIFI
-
SemanticMarker for iOS, Mac or Apple TV interfaces over WIFI in the following ways:
-
Cloud computing web services running at SemanticMarker.org or a configurable location.
This includes background retrieval of new teaching material, which we call a Wave, used by the Knowledge Shark
training section of the SemanticMarker® app.
-
MQTT pub/sub messaging protocol running at SemanticMarker.org or a configurable location.
-
The internal web browser will also interact with external web servers over the HTTP web protocol. The web address cannot be manually entered by the user; instead, external tools can remotely create artifacts that can be browsed by the SemanticMarker® app.
Modified: 10.24.23