Please use this identifier to cite or link to this item: https://scholar.ptuk.edu.ps/handle/123456789/308
License: cc-by
Full metadata record
DC Field                   Value                                               Language
dc.contributor.author      Daraghmi, Eman                                      -
dc.contributor.author      Ghanayem, Mais                                      -
dc.contributor.author      Masarweh, Nour                                      -
dc.date.accessioned        2019-04-20T16:16:48Z                                -
dc.date.available          2019-04-20T16:16:48Z                                -
dc.date.issued             2017                                                -
dc.identifier.uri          https://scholar.ptuk.edu.ps/handle/123456789/308    -
dc.description.abstract (en_US):

This project introduces MASA, a mobile software system for indoor and outdoor environments. The proposed system provides indoor and outdoor location detection and tracking based on a combination of image-marker recognition and inertial measurements. This localization service is the core of a context-aware information system delivered via augmented reality. Augmented Reality (AR) is a live, direct or indirect view of a physical, real-world environment whose elements are augmented (or supplemented) by computer-generated sensory input such as sound, video, graphics, or GPS data. Our proposed system aims to provide a rewarding environment for users seeking illustrative information while sightseeing and visiting new places, and it answers two main questions: "Where am I?" and "What can I learn about this place?". The project was tested in the Faculty of Science and Arts at Palestine Technical University – Kadoori.

Our proposed system consists of three main components: the Vuforia SDK, a location manager, and a communication module. The user localization process starts with video capture from the smartphone camera. The video frames are analyzed in real time to detect and track well-known markers. These markers must correspond to high-contrast planar objects that are part of the landscape and are thus assumed to be static. The image detection process is accomplished with the Vuforia SDK, which delivers the distance and 3D orientation of the image marker. The target orientation and posture are then used by the AR layer to overlay 3D virtual objects that merge seamlessly with the scene in the video capture. Simultaneously, the location manager fetches from the local database the information related to the detected marker, such as its location, thereby implicitly obtaining the current location of the student using the service. Indispensable in this process is the communication module, which connects to the service server via a TCP socket, thus ensuring that the local database is up to date (a minimal sketch of this lookup-and-sync flow follows the metadata record).

Our proposed system provides two main localization services: 1) the indoor localization function and 2) the outdoor localization function. The indoor localization function starts when a student points his/her smartphone at any object inside a building, such as a wall, aisle, or door. The function detects markers and provides information about the view being recognized. For example, if the detected object is a classroom, the function provides information such as the lecture currently running in the class, details about that lecture, the subject being taught, the teacher giving the lecture, and the lectures scheduled for the rest of the day. If the detected object is an office, the function shows information about the professor occupying that office: his/her office hours, the courses he/she teaches, his/her research interests, and his/her contact information. The outdoor localization function was implemented as an overlay on the outside of a building: the student points his/her mobile at a building and the application shows the main offices located inside that building.
dc.publisher               URC2017                                             en_US
dc.title                   MASA: Context Aware Visual Indoor & Outdoor Augmented Reality for a University Campus    en_US
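
The abstract describes the location manager and communication module only at a high level. The sketch below shows, in Java, one way the marker lookup and the TCP-based synchronization of the local marker database could fit together. It is an illustration under assumptions, not the MASA implementation: the names LocationManager and MarkerInfo, the "SYNC" request line, and the pipe-separated row format are all hypothetical, and an in-memory map stands in for the on-device database so the example stays self-contained.

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.io.PrintWriter;
import java.net.Socket;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

public class LocationManager {

    /** One record per image marker: its fixed location plus the context to display. */
    public record MarkerInfo(String building, String room, String info) {}

    // Stand-in for the local database of known markers (the real system would persist this).
    private final Map<String, MarkerInfo> markers = new ConcurrentHashMap<>();

    /** Called when the AR layer (e.g. Vuforia) reports a recognized marker id. */
    public MarkerInfo lookupMarker(String markerId) {
        // Markers are static, so the stored location of the marker implicitly
        // gives the current location of the student holding the phone.
        return markers.get(markerId);
    }

    /** Communication module: pull updated marker rows from the service server over TCP. */
    public void syncWithServer(String host, int port) throws Exception {
        try (Socket socket = new Socket(host, port);
             PrintWriter out = new PrintWriter(socket.getOutputStream(), true);
             BufferedReader in = new BufferedReader(
                     new InputStreamReader(socket.getInputStream()))) {
            out.println("SYNC");                      // hypothetical request line
            String line;
            while ((line = in.readLine()) != null) {  // assumed row format: id|building|room|info
                String[] f = line.split("\\|", 4);
                if (f.length == 4) {
                    markers.put(f[0], new MarkerInfo(f[1], f[2], f[3]));
                }
            }
        }
    }
}
```

On a device, the synchronization call would typically run off the UI thread and update a persistent local database rather than a map; the map keeps this sketch short and runnable without any storage dependency.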
Appears in Collections: Applied science faculty

Files in This Item:
File          Description    Size       Format
masa.png                     3.54 MB    image/png


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.