Research Outputs | WoS | Scopus | TR-Dizin | PubMed

Permanent URI for this community: https://hdl.handle.net/20.500.14719/1741


Search Results

Now showing 1 - 10 of 13
  • Publication
    Physical layer authentication for extending battery life
    (ELSEVIER, 2021) Ayyildiz, Cem; Cetin, Ramazan; Khodzhaev, Zulfidin; Kocak, Taskin; Soyak, Ece Gelal; Gungor, V. Cagri; Kurt, Gunes Karabulut; Bahcesehir University; Oklahoma State University System; Oklahoma State University - Stillwater; University of Louisiana System; University of New Orleans; Abdullah Gul University; Universite de Montreal; Polytechnique Montreal
    Increasing population density in cities and the growing demand for efficient resource usage call for architectures that enable smart cities, such as the Internet of Things (IoT). In most such scenarios, the data generated by IoT sensors is not confidential, but its integrity is critical. Data integrity can be achieved by establishing certification mechanisms that provide cryptographic message authentication protocols; however, this requires relatively expensive components for storing and processing the encryption key on the sensor and consumes more power while processing and transmitting data, which leads to security being forgone in cost-sensitive deployments. In this paper, we propose a security solution that provides data integrity without draining the batteries of IoT sensors. Our solution consists of (i) differentiating legitimate sensors by exploiting the impurities formed during the manufacturing of their transceiver components, and (ii) eliminating the complex components that carry out cryptography as well as the redundant packet header fields, thereby yielding power savings. Power measurements on a testbed implementation of the proposed solution indicate an estimated 2.52-fold improvement in battery life without compromising the integrity of communications in the system, in addition to offering an increase in spectral efficiency and a decrease in the overall IoT device cost.
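The battery-life gain described above can be illustrated with a back-of-the-envelope duty-cycle budget. All figures below (currents, active times, capacity) are hypothetical placeholders rather than the paper's measurements, and `battery_life_days` is an illustrative helper, not the authors' model:

```python
def battery_life_days(capacity_mah, sleep_ma, active_ma, active_s_per_tx, tx_per_hour):
    """Estimate battery lifetime from a simple duty-cycle power budget."""
    active_fraction = (active_s_per_tx * tx_per_hour) / 3600.0
    # Average current is the duty-cycle-weighted mix of active and sleep draw
    avg_ma = active_ma * active_fraction + sleep_ma * (1.0 - active_fraction)
    return capacity_mah / avg_ma / 24.0

# With cryptographic processing and full packet headers (hypothetical figures)
baseline = battery_life_days(2400, 0.005, 20.0, 1.2, 6)
# Crypto components and redundant header fields removed: shorter, cheaper
# active periods per transmission
slim = battery_life_days(2400, 0.005, 18.0, 0.5, 6)
improvement = slim / baseline  # >1, analogous in spirit to the reported 2.52x
```

Shortening the per-transmission active window dominates the savings here, because sleep current contributes little to the average draw.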
  • Publication
    SCORING: Towards Smart Collaborative cOmputing, caching and netwoRking paradIgm for Next Generation communication infrastructures
    (IEEE, 2022) Hmitti, Zakaria Ait; Ben Ammar, Hamza; Soyak, Ece Gelal; Kardjadja, Youcef; Malektaji, Sepideh; Ali, Soukaina Ouledsidi; Rayani, Marsa; Saqib, Muhammad; Taghizadeh, Seyedreza; Ajib, Wessam; Elbiaze, Halima; Ercetin, Ozgur; Ghamri-Doudane, Yacine; Glitho, Roch; University of Quebec; University of Quebec Montreal; Sabanci University; Concordia University - Canada; Bahcesehir University
    The unprecedented increase of heterogeneous devices connected to the Internet, along with the tight requirements of future networks, including 5G and beyond, poses new design challenges to network infrastructures. The collaborative computing, caching, and communication paradigm, together with artificial intelligence, has the potential to enable the Next-Generation Networking Infrastructure (NGNI) needed to fulfill the stringent requirements of emerging applications. In this paper, we propose the SCORING project vision for reshaping the current network infrastructure towards an NGNI acting as a truly distributed, collaborative, and pervasive system that enables the execution of application-specific tasks and the storage of the related data contents in the Cloud-Edge-Mist continuum with high QoS/QoE guarantees.
  • Publication
    A survey on integrated computing, caching, and communication in the cloud-to-edge continuum
    (ELSEVIER, 2024) Maia, Adyson; Boutouchent, Akram; Kardjadja, Youcef; Gherari, Manel; Soyak, Ece Gelal; Saqib, Muhammad; Boussekar, Kacem; Cilbir, Idil; Habibi, Sama; Ali, Soukaina Ouledsidi; Ajib, Wessam; Elbiaze, Halima; Ercetin, Ozgur; Ghamri-Doudane, Yacine; Glitho, Roch; University of Quebec; University of Quebec Montreal; Bahcesehir University; Sabanci University; Concordia University - Canada
    Cloud and edge computing offer different functionalities to support multiple applications requiring different communication, computing, and caching (3C) resources. Upcoming futuristic applications (e.g., metaverse, holographic, and haptic communication) impose further stringent requirements (e.g., ultra-low latency, ultra-high reliability) on the infrastructure. These requirements call for a paradigm shift in the infrastructure architecture, where all resource components and owners collaborate from the cloud up to the edge, creating a cloud-to-edge continuum of integrated resources. Furthermore, we argue that artificial intelligence (AI) and collaboration-based decisions are promising techniques for efficiently managing the highly complex architecture that jointly leverages 3C in the continuum. This article presents a comprehensive survey of existing research, including AI and collaboration-based studies, targeting the effective and seamless provision of 3C resources and services in the cloud-to-edge continuum. Through an extensive analysis of driving use cases, the synergy between these three main services is scrutinized to highlight its crucial role in next-generation network infrastructures (NGNI). Finally, a discussion of the opportunities and challenges brought by integrating 3C in NGNI from different perspectives, including architectural design as well as regulatory and business aspects, is presented.
  • Publication
    Neural Network-Based Human Detection Using Raw UWB Radar Data
    (IEEE, 2024) Dogan, Emine Berjin; Yousefi, Mohammad; Soyak, Ece Gelal; Karamzadeh, Saeid; Kolosovs, D; Bahcesehir University
    Ultra-Wideband (UWB) radar technology is widely used for human detection and tracking through walls because of its effectiveness in low-visibility situations. This study demonstrates neural network-based identification of human presence using raw data obtained directly from the UWB radar. First, measurements have been collected with different human subjects at different positions relative to the UWB radar. A convolutional neural network (CNN) model has been trained on this dataset to detect the presence of a human. Next, the algorithm's effectiveness is investigated in depth using the Gradient-weighted Class Activation Mapping (Grad-CAM) method, and the observations on detected presence are discussed.
  • Publication
    Improving the Robustness of CNN-Based Human Detection in Multiple Raw UWB Radar Datasets
    (IEEE, 2024) Yousefi, Mohammad; Dogan, Emine Berjin; Soyak, Ece Gelal; Karamzadeh, Saeid; Bahcesehir University
    Ultra-wideband (UWB) radar technology has gained significant attention for human detection, vital sign monitoring, and activity recognition applications. In this study, a robust deep-learning model capable of detecting human presence in different environments is developed. The model uses only the raw data obtained directly from the radar, without preprocessing. It is trained and tested on different datasets collected via the UWB radar, and transfer learning is applied to fine-tune it on these datasets, improving its performance and generalization. This study demonstrates the effectiveness of transfer learning in adapting UWB radar-based human detection models to different environments.
  • Publication
    Advancing WebRTC QoE Assessment with Machine Learning in Real-World Wi-Fi Scenarios
    (IEEE, 2024) Argin, Berke; Demir, Mehmet Ozgun; Salik, Elif Dilek; Onalan, Aysun Gurur; Batum, Oyku Han; Soyak, Ece Gelal; Bahcesehir University
    Video conferencing applications play a key role in enabling use cases like remote working, education, and potentially the metaverse. From the perspective of Internet service providers, predicting the end user's Quality of Experience (QoE) in such applications is critical in allocating the right resources to ensure consistently high QoE. This work addresses the estimation of user QoE from link-layer performance metrics such as transferred packets, queue size, signal strength, and channel occupancy for WebRTC-supported applications. Our study entails collecting a data set capturing various Wi-Fi scenarios in practical environments and training machine learning models on this data to estimate the perceived QoE. Our findings demonstrate an improvement in prediction accuracy compared to earlier models and QoE representations; furthermore, we investigate the explainability of the models with the help of SHAP values.
  • Publication
    Effective networking: Enabling effective communications towards 6G
    (ELSEVIER, 2024) Soyak, Ece Gelal; Ercetin, Ozgur; Bahcesehir University; Sabanci University
    The realization of envisioned 6G use cases involving holographic and multi-sensory communications demands terabits-per-second data rates and latencies in the range of microseconds for an immersive experience. Concurrently, 6G's hyper-intelligent IoT use cases require extremely low-latency and reliable communications at the network edge. To address these requirements, communications should be tailored to end-user goals. To this end, we study communication effectiveness, where a sender and receiver harness their computing capabilities and artificial intelligence to maximize the impact of transmitted messages while sending fewer bits. In our model, messages can become shorter as locally accumulated knowledge increases, while still achieving the targeted effect of the message. Hence, we describe a framework in which the accumulated knowledge can be aggregated and shared in a distributed manner. In a real-life use case, we showcase the potential reduction in the number of bits transmitted owing to the transferred accumulated knowledge. Finally, we explore future research directions in effective communications, considering technical, economic, and privacy aspects.
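The intuition that shared, locally accumulated knowledge lets a sender ship fewer bits can be sketched with Python's standard `zlib` preset-dictionary feature. The `knowledge` and `message` byte strings below are invented for illustration; this is not the paper's framework, only an analogy to it:

```python
import zlib

# Hypothetical shared "accumulated knowledge" held by both sender and
# receiver, and a fresh message to transmit (both invented for illustration).
knowledge = b"temperature=21.5C humidity=44% pressure=1013hPa status=OK " * 20
message = b"temperature=21.7C humidity=45% pressure=1012hPa status=OK"

def send(msg, zdict=b""):
    # Compress the message, optionally priming the codec with shared knowledge
    c = zlib.compressobj(level=9, zdict=zdict)
    return c.compress(msg) + c.flush()

def receive(payload, zdict=b""):
    # The receiver must hold the same knowledge to decode the short payload
    d = zlib.decompressobj(zdict=zdict)
    return d.decompress(payload) + d.flush()

without_knowledge = send(message)
with_knowledge = send(message, knowledge)
assert receive(with_knowledge, knowledge) == message
```

The payload compressed against the shared dictionary is noticeably shorter than the one compressed in isolation, mirroring the idea that commonly held context reduces the bits that must cross the channel.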
  • Publication
    Faster Wi-Fi Fingerprinting Using Feature Selection
    (IEEE, 2020) Aydin, Hurkan M.; Ali, Muhammad Ammar; Soyak, Ece Gelal; Bahcesehir University
    Wi-Fi fingerprinting has been widely used for indoor positioning, as Wi-Fi technology is easily deployed and supported. In fingerprinting, a database is created using the received signal strength indicator (RSSI) values in the area of interest; position prediction is then performed by finding the best match for a measured RSSI among the values in the database. As positioning gains importance for continuous interactive (CI) applications in large indoor spaces such as malls and airports, fingerprinting databases become larger, making it computationally more difficult to position targets in real time. On the other hand, CI applications such as Augmented Reality (AR) require low-latency positioning for a good user experience. In this work, we propose to use feature selection methods along with the K-nearest neighbors (KNN) classification and regression algorithms in order to create a simple and swift positioning system. Our evaluation of various feature selection methods shows that computation times for positioning can be reduced by 75% using feature selection.
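The fingerprinting-with-feature-selection pipeline can be sketched in miniature. The variance filter below is just one simple filter-style selector, not necessarily among the methods the paper evaluates, and the tiny four-AP RSSI database is synthetic:

```python
import math
from collections import Counter

def top_k_variance_features(fingerprints, k):
    """Rank RSSI features (APs) by variance across reference points and keep
    the indices of the k most informative ones (simple filter-style selection)."""
    n, n_feat = len(fingerprints), len(fingerprints[0])
    variances = []
    for j in range(n_feat):
        col = [fp[j] for fp in fingerprints]
        mu = sum(col) / n
        variances.append(sum((v - mu) ** 2 for v in col) / n)
    return sorted(range(n_feat), key=lambda j: -variances[j])[:k]

def knn_classify(train, labels, query, feats, k=3):
    """KNN position classification using only the selected feature indices."""
    dists = sorted(
        (math.sqrt(sum((fp[j] - query[j]) ** 2 for j in feats)), lab)
        for fp, lab in zip(train, labels)
    )
    votes = Counter(lab for _, lab in dists[:k])
    return votes.most_common(1)[0][0]

# Synthetic RSSI database: only AP0 and AP1 actually vary with location
train = [[-40, -70, -90, -90], [-42, -68, -90, -90],
         [-70, -40, -90, -90], [-68, -42, -90, -90]]
labels = ["roomA", "roomA", "roomB", "roomB"]
feats = top_k_variance_features(train, 2)   # selects the two varying APs
room = knn_classify(train, labels, [-41, -69, -90, -90], feats)  # "roomA"
```

Dropping the uninformative APs shrinks every distance computation, which is the source of the runtime savings as databases grow.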
  • Publication
    The Analysis of Feature Selection with Machine Learning for Indoor Positioning
    (IEEE, 2021) Aydin, Hurkan M.; Ali, Muhammad Ammar; Soyak, Ece Gelal; Bahcesehir University
    Indoor positioning is useful in various venues, including warehouses, convention centers, malls, airports, and nursing homes. In these scenarios, reducing the complexity of location estimation both improves responsiveness and helps extend the battery life of the mobile device. In this work, we carry out a detailed analysis of the impact of Principal Component Analysis (PCA) on the computational complexity and accuracy of different machine learning algorithms, on a large data set containing 520 APs. We compare the algorithms' training and testing times, as well as their accuracies, in the presence and absence of PCA. Our results show that (i) PCA significantly reduces both the training and testing times for classification and regression using the k-nearest neighbor (kNN) and support vector machine (SVM) algorithms while preserving, if not improving, accuracy, (ii) PCA slightly improves the training/testing times for regression using a multi-layer perceptron (MLP), and (iii) random forest (RF) does not perform well with PCA.
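The dimensionality reduction PCA provides can be shown in miniature: projecting 2-D points onto their first principal component halves the feature count while keeping most of the variance. The closed-form 2x2 eigendecomposition below is a pure-stdlib illustration; the paper itself works with 520-dimensional AP data and standard PCA implementations:

```python
import math

def pca_project_2d(points):
    """Project 2-D points onto their first principal component and return
    the resulting 1-D coordinates (closed form for the 2x2 covariance case)."""
    n = len(points)
    mx = sum(p[0] for p in points) / n
    my = sum(p[1] for p in points) / n
    # Covariance matrix entries
    sxx = sum((p[0] - mx) ** 2 for p in points) / n
    syy = sum((p[1] - my) ** 2 for p in points) / n
    sxy = sum((p[0] - mx) * (p[1] - my) for p in points) / n
    # Largest eigenvalue of [[sxx, sxy], [sxy, syy]] and its eigenvector
    tr, det = sxx + syy, sxx * syy - sxy * sxy
    lam = tr / 2 + math.sqrt(tr * tr / 4 - det)
    if sxy != 0:
        vx, vy = lam - syy, sxy
    else:
        vx, vy = (1.0, 0.0) if sxx >= syy else (0.0, 1.0)
    norm = math.hypot(vx, vy)
    vx, vy = vx / norm, vy / norm
    return [(p[0] - mx) * vx + (p[1] - my) * vy for p in points]

# Perfectly correlated points: one component captures all the variance
proj = pca_project_2d([(0, 0), (1, 1), (2, 2)])
```

Downstream kNN or SVM distance computations then run over the reduced coordinates, which is where the training/testing time savings come from.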
  • Publication
    Low-Powered Agriculture IoT Systems with LoRa
    (IEEE, 2020) Kokten, Esma; Caliskan, Bahadir Can; Karamzadeh, Saeid; Soyak, Ece Gelal; Aboltins, A; Litvinenko, A; Bahcesehir University
    Monitoring is key to increasing the efficiency of food storage in the open field in terms of cost, logistics, and crop quality. For long-range data transmission in such environments, mobile technologies are not suitable, as the end devices are generally battery-limited. In this work, a prototype has been developed for monitoring goods in storage. The battery lifetime of this prototype is analysed through calculations as well as measurements, on LoRa technology. Our results show that (i) although the sleep current accounts for the smallest share of consumption, it has the greatest impact on extending battery life, (ii) the monitoring node should use a low self-discharge battery to achieve a long battery life, and (iii) the sensors are the main power sink depleting the battery.
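A battery-lifetime calculation of the kind described can be sketched with an average-current model that includes self-discharge. Every current, timing, and capacity figure below is a hypothetical placeholder, not a measurement from the prototype:

```python
def lora_lifetime_hours(capacity_mah, i_tx_ma, t_tx_s, i_sensor_ma, t_sensor_s,
                        i_sleep_ma, period_s, self_discharge_pct_month):
    """Lifetime of a duty-cycled LoRa node from its average current draw."""
    # Charge drawn in one reporting period (mA*s): transmit + sense + sleep
    q = (i_tx_ma * t_tx_s + i_sensor_ma * t_sensor_s
         + i_sleep_ma * (period_s - t_tx_s - t_sensor_s))
    avg_ma = q / period_s
    # Self-discharge modelled as an extra constant drain on the cell
    sd_ma = capacity_mah * (self_discharge_pct_month / 100.0) / (30 * 24)
    return capacity_mah / (avg_ma + sd_ma)

# Hypothetical node reporting every 10 minutes over LoRa
lifetime = lora_lifetime_hours(2400, 120.0, 0.2, 15.0, 3.0, 0.002, 600, 1.0)
```

Varying one term at a time in such a model shows which component dominates the budget, e.g. zeroing the sensor draw or lowering the self-discharge rate lengthens the estimated lifetime.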