DAY 1: Thursday, April 20, 2023 SCHEDULE
8:30 – 9:15
Registration check-in (Location: Hole Academic Centre entrance)
Opening of event (Location: T104)
9:15 – 9:30
Opening remarks (C. Craveiro Salvado)
Greetings from CUE’s President and Vice-Chancellor (T. Loreman)
Session 1: Oral Presentations
Faculties of Arts and Science (Location: T104)
SESSION CHAIRS: M. Golizeh and I. Katyal (all talks: 15 min each + 5 min Q&A)
9:30 – 9:50 Brady Reid (History)
9:50 – 10:10 Kevin St. Arnaud (Psychology)
10:10 – 10:30 Kristen Zentner (Psychology)
10:30 – 10:50 Break
10:50 – 11:10 Deborah Hemmerling (Biological and Environmental Sciences)
11:10 – 11:30 Seth Nobert (Mathematical and Physical Sciences)
11:30 – 11:50 Nasim Hajari (Mathematical and Physical Sciences)
11:50 – 12:10 Kira Sviderskaia (Biological and Environmental Sciences)
12:10 – 13:00 Lunch Break (Location: Tegler Centre)
13:00 – 13:20 Megan MacElheren (Biological and Environmental Sciences)
13:20 – 13:40 Cole Babcock (Mathematical and Physical Sciences)
13:40 – 14:00 Md Morshedul Islam (Mathematical and Physical Sciences)
14:00 – 14:20 Makan Golizeh (Mathematical and Physical Sciences)
Session 2: Poster Presentations
14:30 – 16:30
Faculties of Arts and Science (Location: Tegler Centre)
Session 3: Networking and Poster Session
16:30 – 18:30
Faculty of Management (Location: Tegler Centre)
SESSION CHAIRS: S. Butakov and E. AbdAllah
DAY 2: Friday, April 21, 2023 SCHEDULE
8:30 – 9:15
Registration check-in (Location: Hole Academic Centre entrance)
Opening of Day 2 Event (Location: T104)
9:15 – 9:20
Opening remarks (C. Craveiro Salvado)
Session 4: Oral Presentations
Faculties of Education and Management (Location: T104)
SESSION CHAIRS: E. AbdAllah and T. Fowler (all talks: 15 min each + 5 min Q&A)
9:20 – 9:40 Lilian Behzadi (MISSM)
9:40 – 10:00 James Joseph (MISSM)
10:10 – 10:30 Oluwseun Bewaji (MISSM)
10:30 – 10:50 Ebubechi Okpaegbe (MISSM)
10:50 – 11:10 Break
11:10 – 11:30 Grishma Raj Gautam (MISSM)
11:30 – 11:50 Gayatri Deepak Uttwani (MISSM)
11:50 – 12:10 Ramnik Singh Reen (MISSM)
12:10 – 13:00 Lunch Break (Location: Tegler Centre)
13:00 – 13:20 Sabrina Prova (MISSM)
13:20 – 13:40 Kawser Mazumder (MISSM)
13:40 – 14:00 Deepa Thangavelu (MISSM)
14:00 – 14:20 Teresa Fowler (Education)
14:30
Closing of Event (Location: T104)
Prizes for student oral and poster presentations
Closing remarks
Avoidance and Exhaustion Oral Presentation Abstract
The Beothuk, a Canadian Indigenous group, are known for their extinction sometime during the 19th century. Although the Beothuk's extinction became a popular research topic during the 20th century, the Beothuk narrative is still plagued with misconceptions. In my research paper, entitled "Avoidance and Exhaustion," I address misconceptions about the interactions between various Indigenous groups and the Europeans who encountered the Beothuk. My research paper will be the basis of my oral presentation. The Beothuk chose to avoid unnecessary interaction with European fishermen and explorers. As a result, it is a challenge to make any conclusive arguments about the causes of the Beothuk extinction or their cultural practices. Researchers agree that documented interactions between the Beothuk and Europeans are faulty; however, they rely on these supposed interactions because of the lack of other available sources. Publications about the Beothuk have slowed since the turn of the century, but recent excavations have provided new evidence. A more accurate portrayal of the Beothuk's extinction is achievable by relying on archaeological evidence over
personal testimonials from oral folklore. By utilizing archaeological evidence, it is possible to determine the migration patterns of the Beothuk and their attempt to survive in the island of Newfoundland's resource-poor interior. I plan to incorporate the migration patterns, the Beothuk's pattern of avoidance, and the harsh living conditions into a PowerPoint presentation. The presentation will consist of slides with bullet points, maps, and pictures. The maps will demonstrate the gradual withdrawal of the Beothuk to the island of Newfoundland's interior. I intend to write a general script that I can follow to ensure I do not go past the 15-minute mark. I will also include pictures of the last known surviving Beothuk (Shawnadithit) and her drawings to demonstrate how little is known about the extinct Indigenous group. The bullet-point slides will take up most of my presentation and be used to defend my arguments. I intend to utilize my PowerPoint and talking points to defend my thesis: "The extinction of the Beothuk is a combination of their practice of avoidance, the interior environment of Newfoundland, and Europeans exhausting the land of its resources and animal population."
Faculty: ARTS
Department: HISTORY
Entheogens and Spiritual Seeking: The Quest for Self-Transcendence, Psychological Well-Being, and Psychospiritual Development
Background and Aims: Although numerous cultures have used psychoactive substances for spiritual, or entheogenic, purposes, little is known about contemporary entheogenic spirituality, particularly outside of the few traditions that retain sacramental drug use practices. Method: To better conceptualize contemporary patterns of entheogenic drug use, an international, online study of entheogenic drug users was conducted (n = 684). Hierarchical regression analysis was used to explore entheogenic drug use in relation to measures of spiritual seeking (importance of spirituality in life, meditation practice, openness to experience), self-transcendent experiences (awe-proneness, mystical experiences), psychological well-being (subjective and eudaimonic well-being), and psychospiritual development (quiet ego, wisdom, and spiritual
development). ANOVA was used to compare entheogenic drug users with non-entheogenic drug users and non-drug users to assess differences across these psychospiritual variables. Results: Of the 12 drug categories assessed, the classic psychedelics were most commonly used as entheogens. Entheogenic classic psychedelic use was associated with all of the assessed psychospiritual variables; entheogenic classic psychedelic users showed higher levels of spiritual seeking, self-transcendence, psychological well-being, and psychospiritual development compared to non-entheogenic classic psychedelic users and non-users.
Conclusion: Entheogenic spirituality may be conceptualized as a practice of spiritual seeking or implicit mysticism: the quest for self-transcendence and personal growth.
Faculty: ARTS
Department: PSYCHOLOGY
Enhanced surveillance for climate change effects on chronic disease in Alberta: A literature review and qualitative summary of stakeholders and vulnerable populations
Background: Careless responding is a threat to the validity of self-report scores. People experiencing emotional distress struggle with cognitive and motivational decline, which has been correlated with patterns of careless responding. Although several methods have detected careless responses in psychologically distressed respondents, response time has not been widely explored.
Purpose/Aim/Hypothesis: In the current study, it was hypothesized that psychologically distressed respondents (i.e., depression, anxiety, stress) would have higher rates of careless responding, as measured with a response time approach.
Methods: The normative threshold approach, which is based on item response time, was used to identify careless responding and find its association with emotional distress using the Depression Anxiety Stress Scale.
Results: A significant correlation was found between the number of careless responses and overall distress, r(37576)=.03, p<.001. Independent samples t-tests showed statistically significant differences between careless and careful responders in depression t(560.41)=2.027, p<.001, anxiety t(558.367)=5.77, p<.001, stress t(558.845)=1.876, p<.001 and overall distress t(558.064)=3.317, p<.001. Careless responders had higher scores of depression, anxiety, stress, and overall distress than careful responders.
Conclusions: Our results suggest that response time data can improve specificity in identifying careless responses of distressed responders, augment interpretations, and add precision to conclusions. Further exploration is needed on the use of response time to enhance clinical decision making in clinically distressed responders.
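The screening-then-comparison workflow described above can be sketched in a few lines. This is an illustrative sketch only: the fixed 1-second cutoff is a placeholder (the normative-threshold approach derives per-item cutoffs from the sample), and the hand-rolled Welch's t helper and toy scores are assumptions, not the study's data.

```python
from statistics import mean, variance

def flag_careless(median_rts, threshold_s=1.0):
    """Flag respondents whose median item response time falls below a
    plausibility threshold (the 1 s default is purely illustrative)."""
    return [rt < threshold_s for rt in median_rts]

def welch_t(a, b):
    """Welch's t statistic for two independent samples, allowing
    unequal variances (matches the fractional df reported above)."""
    va, vb = variance(a) / len(a), variance(b) / len(b)
    return (mean(a) - mean(b)) / (va + vb) ** 0.5

# Toy distress scores for flagged (careless) vs. unflagged (careful) groups
careless_scores = [22, 25, 19, 28, 24]
careful_scores = [14, 16, 12, 18, 15]
t_stat = welch_t(careless_scores, careful_scores)  # positive: careless group scores higher
```

In practice one would use a tested routine such as `scipy.stats.ttest_ind(..., equal_var=False)` rather than the hand-rolled statistic above.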
Isolation and characterization of the polystyrene metabolizing gut microbiome of superworms (Zophobas morio)
Global society produces vast amounts of plastic waste. It is projected that by 2050, twenty-six billion tons of plastic waste will be generated, most of which will not be recycled or remediated. Among this waste is expanded polystyrene, a plastic commonly used in the packaging of various materials. Polystyrene waste is problematic not only due to its environmental persistence but also due to its associated toxicity. It is easier and cheaper to produce new polystyrene than it is to recycle it. Larvae of Zophobas morio, commonly known as superworms, are able to eat and rapidly degrade ingested polystyrene. Larval gut bacteria, including Pseudomonas sp., contribute to this process. The purpose of this research was to isolate and identify bacteria from the gut of superworm larvae that had been grown with polystyrene as the sole source of nutrition. Bacteria were isolated from the gut of superworm larvae that had been maintained on a diet of expanded polystyrene for twenty-eight days. Isolated bacteria were maintained in liquid media with polystyrene as the sole carbon source for sixty days. The enriched culture was used to inoculate plates for colony isolation. Individual colonies were identified as different isolates based on colony morphology. Genomic DNA from individual colonies was subjected to polymerase chain
reaction amplification of the V4 region of the 16S rRNA gene. PCR amplicons were sequenced, and the sequences were aligned to sequences in the GenBank database using the Basic Local Alignment Search Tool for species verification. Strain identification of bacteria derived from the superworm gut indicated that isolate WEP1SWBLUV20 was a Stenotrophomonas sp. Isolates WEP13SWBLUV20 and WEP14SWBLUV20 were confirmed by sequencing to be Pseudomonas aeruginosa. Stenotrophomonas sp. has not previously been identified as a component of the gut microbial community in superworm larvae. The genus Stenotrophomonas is composed of at least sixteen species that have been identified from various sources. The main environmental reservoirs of Stenotrophomonas spp. are plants and soil. Members of this genus have been found to persist in the gut of the bark beetle Dendroctonus rhizophagus throughout its life cycle. In future work, the bacteria isolated from the gut of superworms will be further characterized by identifying the enzymes responsible for plastic degradation. A better understanding of the microbial contribution to polystyrene degradation will be helpful in developing remediation strategies for plastic waste.
Faculty: SCIENCE
Department: BIOLOGICAL AND ENVIRONMENTAL SCIENCES
Phytic acid metal chelation in aqueous solutions
Chelating agents have been used in various treatments for metal-dependent disorders, such as diabetes and cardiovascular diseases. Powerful chelators, such as polyamines and polyacids, were found to have an inhibitory effect on these conditions in humans; however, the mechanisms behind these actions are not fully understood. Previous studies have used titration, electrochemistry, and UV-visible spectrophotometry to assess the ability of chelating agents to bind metals. Phytic acid, a polyacid abundant in whole grains and nuts, has recently gained interest for its selective chelation of bivalent metals. This study aimed to determine the chelation efficiency of phytic acid with important biometals, including iron(II), copper(II), zinc(II), and calcium, using X-ray fluorescence (XRF) spectrometry. The chelating ability of phytic acid was then
compared to that of other natural chelators, as well as EDTA, a powerful, non-selective chemical counterpart. The chelator was incubated with the metal of interest and subjected to ion-exchange solid-phase extraction (SPE) or centrifugation coupled with acid digestion. Experimental conditions were optimized for each metal-chelator reaction. Complexed and free metals were then analyzed in the SPE eluate or acid digest using XRF quantitation. An energy-dispersive XRF spectrometer was calibrated to measure target metals in aqueous samples. The chelating ability of phytic acid was found to be greater than that of other natural chelators and EDTA. In this study, we introduce XRF quantitation as an alternative method to determine the chelating ability of phytic acid and other natural chelators. This study is part of a broader research program on metal-induced glycoxidative stress.
Faculty: SCIENCE
Department: MATHEMATICAL AND PHYSICAL SCIENCES
Human Fall Detection: A Multimodal Approach
Falls are the primary cause of fatal and non-fatal injuries among the elderly population, according to the WHO. Timely assistance is essential to minimize the severity of injury. Therefore, developing reliable, real-time fall detection systems is highly beneficial in long-term care facilities and healthcare settings.
Current fall detection systems can be grouped into four main categories. The first category is based on wearable devices and sensors embedded in clothing or in other devices such as smartwatches. These systems provide highly accurate detection due to their close proximity to the user. However, their efficiency may be compromised if the device stops working or the user forgets to wear it.
The second category uses ambient devices, such as pressure sensors embedded in the floor. These systems can be expensive to install and are susceptible to false positives.
The third category is computer vision-based fall detection systems, where a single RGB camera, multiple RGB cameras, or depth cameras like Intel RealSense and Microsoft Kinect are being used. They are easy and inexpensive to set up, and there is no need for the user to wear any extra devices or modify the surveillance area. However, privacy concerns can arise, which can be addressed through automatic blurring or reduction of the visual data to a more compact
form. Traditional computer vision relies on expert domain knowledge for feature extraction, while learning-based computer vision employs neural networks such as Convolutional Neural Networks (CNNs) to automatically extract features. The efficiency of vision-based systems depends on the quality of the extracted features or on the architecture and parameters of the neural networks used.
Finally, hybrid approaches use multimodal data sources, such as video, audio, and motion sensors, to perform separate predictions for each modality. These predictions are then combined through mechanisms like voting or stacking to produce a more reliable overall result. While hybrid techniques offer improved accuracy, late fusion mechanisms can ignore important interdependencies in the feature space, potentially leading to missed fall events. Our research focused on developing a real-time multimodal fall detection system based on learning algorithms and comparing its efficiency and resiliency with those of a simpler fall detection system based solely on computer vision. The proposed system employs an early fusion strategy. Experimental results showed that the proposed model outperformed the traditional single-modality fall detection system, even when some data modalities were missing during testing. In future work, we will investigate intermediate fusion strategies and compare their efficiency and resiliency to early and late fusion strategies.
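The core of the early-fusion idea can be illustrated with a minimal sketch. Everything here is assumed for illustration: the feature dimensions, the zero-imputation of a missing modality, and the function name are not taken from the actual system.

```python
def early_fuse(video_feat, audio_feat, motion_feat, dims=(4, 3, 3)):
    """Early fusion: concatenate per-modality feature vectors into one joint
    vector *before* classification, so a downstream model can learn
    cross-modality dependencies that late (decision-level) fusion discards.
    A missing modality (None) is zero-imputed here to keep the fused
    dimensionality fixed -- one simple way to stay operational when a
    sensor drops out."""
    parts = []
    for feat, dim in zip((video_feat, audio_feat, motion_feat), dims):
        parts.extend(feat if feat is not None else [0.0] * dim)
    return parts

# Audio missing at inference time: the fused vector keeps its full length
fused = early_fuse([0.2, 0.1, 0.9, 0.4], None, [0.5, 0.3, 0.1])
```

The fused vector would then feed a single classifier, in contrast to late fusion, where each modality gets its own classifier and only the decisions are combined.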
Faculty: SCIENCE
Department: MATHEMATICAL AND PHYSICAL SCIENCES
Microglia 101: How one cell type is connected to all brain disorders
Once considered to be merely structural material, glial cells, including microglia, are now rightfully recognized as indispensable to brain function. Microglia are immune cells residing in the brain that are finely tuned to respond to injury and pathogens. Microglia also aid in development by controlling cell migration, differentiation, and survival. Microglia possess a remarkably reactive phenotype that allows them to guard the integrity of the brain. However, in instances of prolonged inflammation, such as Alzheimer's disease or COVID-19, microglia may develop a malicious inflammatory phenotype that further damages the brain tissue. Many therapeutic approaches for neurodegenerative pathologies center around pushing microglia from an inflamed to a neuroprotective state. My project explored metabolism in microglia, with an emphasis on the interaction of lactate and cannabidiol (CBD) as anti-inflammatory agents. Despite limited knowledge about microglia's role in these conditions, microglia remain a popular target due to their therapeutic merit. Microglia serve as a prime example of how the ability to sense and respond to the environment is crucial for sustaining life.
Faculty: SCIENCE
Department: BIOLOGICAL AND ENVIRONMENTAL SCIENCES
Low-cost forage management (hay and pasture systems, legume seeding) impacts on productivity and soil health of old grassland: Legacy effects on soil health indicators during establishment year 2022
The loss of perennial hay and tame pasture in Canada throughout the recent decade due to conversion to cropland has potential effects on overall soil health, including carbon sequestration and nutrient cycling. As an alternative to the use of fertilizer, the inclusion of legumes is known to improve forage quality and productivity due to their nitrogen-fixing abilities. Legume inclusion can become problematic when conventional seeding practices are used, causing disturbance to the vegetation and soil. Sod-seeding is a less invasive approach to including legumes within perennial stands. The current study (2022-2026) aims to examine the effects of sod-seeding legumes into old grass stands as an alternative to the costly application of fertilizer. Measurement of soil health indicators is essential to assess the long-term sustainability of soil resources under different management practices. Since soil health indicators reflect legacy effects of previous management practices, they were evaluated during the establishment year (2022), before the application of treatments. Permanganate oxidizable carbon is a measure of the carbon pool that soil microbes can easily access as a source of energy. It was hypothesized that perennial stands would produce higher amounts of permanganate oxidizable carbon due to higher organic matter accumulation compared to annual stands. The soil protein index is a measure of the quantity of proteins
in organic matter, which makes up the largest pool of organic nitrogen in the soil. Microbes can mineralize this form into ammonium and nitrate, making it available for plant uptake. Since soil proteins are tied to organic matter, it was hypothesized that perennial stands would show a higher quantity of proteins compared to annual stands. Soil samples were collected in spring 2022, before treatments were applied. To measure permanganate oxidizable carbon, soil samples were reacted with KMnO4 and measured via spectrophotometry. The soil protein index was measured using the autoclaved citrate extractable protein method. Soil proteins were extracted in a sodium citrate solution and held at 121 °C. The concentration of proteins in the extract was determined via a bicinchoninic acid assay and quantified against a bovine serum albumin standard curve, measured via spectrophotometry. Permanganate oxidizable carbon was significantly higher under perennial grasses than annuals at the 0-15 cm depth and generally decreased with soil depth. The average soil protein concentration was higher under perennial grasses than annuals for the 0-15 cm depth, with overall concentration decreasing with depth. The examined soil health indicators showed legacy effects of the historical management of the site. Additional soil health indicators are being assessed yearly to determine the impacts of legume inclusion in mixtures with perennials or annuals.
Faculty: SCIENCE
Department: BIOLOGICAL AND ENVIRONMENTAL SCIENCES
Biodegradable solid support for heterogeneous bioassays
Bioassays are used in a myriad of applications, such as diagnosis, monitoring, and post-treatment screening of human diseases. Current methods rely heavily on plastic labware and therefore generate considerable amounts of waste that persist in the environment.
The aim of this research was to develop an environmentally friendly bioplastic support to be used in heterogeneous bioassays. To this end, a biodegradable polymer was synthesized and tested using a novel derivative of cellulose acetate as an effective and inexpensive backbone. Chemical modifications were conducted to ensure that the desired end product was achieved: a durable bioplastic whose functional groups enable immobilization of common recognition molecules, such as antibodies and aptamers, through covalent linkages. Synthetic methods were selected based on a thorough literature search for individual functionalization reactions. The
order in which they were conducted was vital to ensure an increased number of immobilization sites over the existing methods, such as those based on silica supports. To assess the loading capacity of the support, a microplate was fabricated. A fluorescent group was then attached to the biopolymer surface, and the fluorescence emission was used as a surrogate to the number of loading sites available.
In future, the same methods will be utilized to immobilize custom synthesized, fluorescent-tagged antibodies or aptamers to this solid support. Two ASTM methods will be used to assess the biodegradability of the biopolymer under aerobic and anaerobic conditions, and their byproducts will be analyzed using gas chromatography- and/or liquid chromatography-mass spectrometry. Further tests will be performed to determine the physical properties of the plastic, and its potential reusability.
Faculty: SCIENCE
Department: MATHEMATICAL AND PHYSICAL SCIENCES
Information Theoretic Measures for Behavioral Authentication Systems
Entropy, such as Shannon entropy, is a widely accepted information measure for information and communication systems. However, existing entropy estimators cannot capture the variability of biometric features in their measurement and have limitations in measuring the information of biometric systems. Biometric information is a new approach to measuring the information in biometric profiles. The Kullback-Leibler divergence between the feature distribution of a profile and that of the population is used to measure biometric information. Biometric system entropy is a similar approach to biometric information and is defined based on the ability of the verification algorithm to capture information in the biometric profiles. In biometric systems, for a verification claim, a verification algorithm compares the verification data with stored profiles and produces scores. For N such claims, the Kullback-Leibler divergence between the distributions of valid scores and invalid scores is used to measure biometric system entropy. The goal of this proposal is to study the applicability of these two metrics for behavioral authentication systems so that we can compare the security of behavioral authentication systems with that of other authentication systems. Behavioral authentication systems authenticate users by being acquainted with their behaviors. They construct behavioral profiles from the data of a set of users' behavioral
features. In particular, here we use these two metrics to measure the unpredictability of behavioral profiles and the uncertainty involved in passing a verification claim. More biometric information in a profile means that the feature distribution of the profile is more distinct, which makes the profile more unpredictable. On the other hand, biometric system entropy allows us to understand the identification performance of an authentication system intuitively. More biometric system entropy in an authentication system means a greater difference between the genuine score and impostor score distributions, which increases an attacker's uncertainty in passing verification. We used behavioral data from an existing behavioral authentication system, called eDAC, to estimate the biometric information and biometric system entropy. Our experimental results show that, on average, each eDAC profile carries 28.71 bits of biometric information. Based on these experimental results, we can also say that behavioral profiles are less unpredictable than biometric profiles, which is expected. The biometric system entropy for different verification algorithms of eDAC varies from 7.24 bits to 8.25 bits. In comparison to biometric systems, behavioral authentication systems have less uncertainty to pass the verification, which is also expected.
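Under a simplifying assumption of independent Gaussian features (an illustration only, not necessarily eDAC's actual model), the biometric-information measure reduces to a sum of closed-form per-feature Kullback-Leibler divergences. A minimal sketch, with hypothetical toy numbers:

```python
from math import log

def gaussian_kl_bits(mu_p, sd_p, mu_q, sd_q):
    """KL(P || Q) in bits for univariate Gaussians, with P the user's
    profile distribution for one feature and Q the population's."""
    nats = log(sd_q / sd_p) + (sd_p**2 + (mu_p - mu_q)**2) / (2 * sd_q**2) - 0.5
    return nats / log(2)

def biometric_information(profile, population):
    """Sum per-feature divergences; since the features are assumed
    independent, the per-feature KL terms simply add. Each entry is a
    (mean, standard deviation) pair for one behavioral feature."""
    return sum(gaussian_kl_bits(mp, sp, mq, sq)
               for (mp, sp), (mq, sq) in zip(profile, population))

# Toy profile with two behavioral features: the more its feature
# distributions deviate from the population's, the more bits it carries
profile = [(5.0, 0.5), (2.0, 0.4)]
population = [(3.0, 2.0), (2.5, 1.5)]
bits = biometric_information(profile, population)
```

A profile whose feature distributions match the population exactly carries zero bits by this measure, consistent with the intuition above that distinctiveness is what makes a profile unpredictable.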
Faculty: SCIENCE
Department: MATHEMATICAL AND PHYSICAL SCIENCES
Biomarker Discovery of Metal-Induced Oxidative Stress: 2023 Progress Report
Biomarker discovery has enabled efficient, reliable diagnosis and post-treatment monitoring of human diseases. Many biomarkers have been commercialized for use in medical laboratories to screen for various types of cancer, chronic, genetic, and infectious diseases.
Mass spectrometry is a powerful tool for molecular profiling and, therefore, is often used to identify a molecular signature associated with a biological condition, such as a disease or exposure to an environmental factor. To this end, samples from a target population presenting the condition of interest are collected along with samples from a normal population as the control. Mass spectra from the target group are compared to those from the control, and molecular features unique to the target group are identified as potential biomarkers of the studied condition. Biomarker candidates are then validated in a larger study population and developed into kits for diagnostic purposes.
Oxidative stress occurs naturally in all living organisms, leading to a cascade of undesired, potentially harmful biochemical reactions that can damage the organism. One of these reactions is the formation of advanced glycation end-products (AGEs). AGEs are formed from reducing sugars and biological amines, such as amino acids, nucleic acids, and proteins. AGE formation plays a major role in ageing, diabetes,
neurodegenerative diseases, and cardiovascular diseases. Heavy metals, such as iron and copper, are known to accelerate AGE formation by promoting oxidative reactions.
My research aims to identify AGEs in biological samples as potential biomarkers of oxidative stress, and to use them to understand the effect of heavy metals on oxidative stress. Over the past three years at CUE, my research team has developed methods for laboratory synthesis and analysis of AGEs using conventional heating and microwave irradiation. We have used mass spectrometry methods to identify AGEs produced under different reaction conditions, such as varying temperature, pH, and type/concentration of heavy metals.
We have found that natural metal scavengers, such as phytic acid, could reduce total AGE formation under laboratory conditions; however, they may increase the production of a potentially toxic AGE when added as food ingredients (unpublished results). We are currently testing the effect of natural metal scavengers on various biologically important metals. We are also developing an analytical platform to detect AGEs in biological samples using an inexpensive, sensitive bioassay. The AGEs identified at this phase will later be assessed in a human cell model and validated as biomarkers of metal-induced oxidative stress.
Faculty: SCIENCE
Department: MATHEMATICAL AND PHYSICAL SCIENCES
Blockchain Security Considerations
This book chapter provides a comprehensive overview of the security considerations associated with blockchain technology and discusses the major attacks and potential solutions to mitigate these challenges. The chapter covers security in blockchain technology, the OWASP top ten vulnerabilities, blockchain attributes for trustworthiness, attack vectors (user level, network level, system level, and smart contracts), and blockchain-related attacks at these attack vectors. It also suggests mitigation strategies for different attacks and best practices for companies to mitigate the risks posed by these attacks. The chapter highlights some of the significant security concerns associated with blockchain technology. These include 51% attacks, smart contract vulnerabilities, weaknesses in the underlying code, and denial-of-service attacks. The chapter emphasizes that these security concerns are particularly important for smaller or newer blockchains that may not have been thoroughly tested or audited. Therefore, developers and users of blockchain technology must be aware of these potential security risks and take appropriate measures to mitigate them. The book chapter offers several potential solutions to address these security challenges. These include using a hybrid consensus mechanism, employing multi-signature transactions, and conducting regular code reviews and security audits. The chapter also suggests best practices for companies to mitigate the risks posed by these attacks, such as providing regular training to employees, regularly reviewing blockchain security, and implementing a robust incident response plan.
Faculty: MANAGEMENT
Department: MASTER OF INFORMATION SYSTEM SECURITY MANAGEMENT
Telemedicine to Reduce Poverty Using Blockchain
Telemedicine has revolutionized healthcare delivery by allowing remote consultations and diagnosis through digital platforms. On the other hand, blockchain technology provides secure and transparent storage and sharing of medical records, making it an attractive tool for healthcare providers. The integration of telemedicine and blockchain has the potential to reduce poverty by providing efficient and cost-effective care to patients in impoverished areas. This book chapter explores the benefits and challenges of using telemedicine and blockchain in poverty reduction, and potential solutions to overcome those challenges. The chapter discusses the evolution of telemedicine and its integration with blockchain technology. It also highlights the use cases of blockchain technology in healthcare, such as secure storage of medical records and supply chain management. Additionally, it presents various telemedicine and blockchain initiatives implemented worldwide and their impact on poverty reduction. One of the major benefits of the integration of telemedicine and blockchain is increased access to affordable healthcare services. Non-profit organizations providing telemedicine services to patients in impoverished areas have been able to reach more patients, resulting in improved
health outcomes. The chapter also outlines the benefits of blockchain technology in healthcare, including increased security, transparency, and interoperability of medical records. It discusses the application of blockchain technology in telemedicine, including the use of smart contracts and decentralized platforms for secure and transparent consultations. It also proposes a telemedicine architecture using blockchain technology, highlighting best practices for organizations setting up a telemedicine blockchain. Despite the potential benefits of integrating telemedicine and blockchain, several challenges must be addressed, such as the lack of infrastructure and technical expertise in impoverished areas. The chapter proposes potential solutions, such as public-private partnerships and capacity-building programs, to overcome these challenges. In conclusion, the integration of telemedicine and blockchain has the potential to revolutionize healthcare delivery and reduce poverty worldwide. The chapter provides valuable insights for healthcare providers, policymakers, and researchers interested in using these technologies to improve healthcare access and affordability in impoverished areas.
MASTER OF INFORMATION SYSTEMS SECURITY MANAGEMENT
Faculty: MANAGEMENT
Department: MANAGEMENT
RFID Implants in the Healthcare Industries
The utilization of Radio Frequency Identification (RFID) technology allows for the efficient, automated, and immediate collection and transmission of data without human involvement. This technology is recognized as a promising solution for improving patient safety and management. In this study, an RFID-based architecture is proposed for Emergency Medical Services (EMS) to tackle issues like insufficient medical diagnosis, patient identification, and medication administration during emergency situations. These problems arise due to a lack of patient information, including medical history, allergies, and current medical condition. Furthermore, this paper addresses security and privacy concerns related to the proposed architecture and offers recommendations for mitigating these risks.
BACKGROUND
Medical emergencies are a longstanding issue in the healthcare sector, particularly outside of hospitals. Issues to consider in emergency cases include patient identity, access to medical records, inaccurate patient diagnoses, mistakes in the prescription, administration, and distribution of medication, etc., in addition to the emergency medical services (EMS) response time. Radio Frequency Identification (RFID) technology can be used to
address these issues and more in the healthcare sector. RFID is a form of wireless communication in which devices use radio signals to exchange identifying data. This paper looks at how RFID, in the form of implants, can be used to solve problems associated with EMS, such as improper diagnosis due to the lack of patient medical history, patient identification, and administration of treatment without knowledge of the patient’s allergies, and it considers the security and privacy issues of implementing RFID technology.
Our main contributions in this paper can be summarized in the following points:
• Propose an architecture for RFID implants in healthcare systems.
• Analyze the security vulnerabilities of the proposed architecture.
• Propose security solutions for the architecture that mitigate the identified attacks and vulnerabilities.
The main goal of the proposed architecture is to protect the security and privacy of medical data while giving medical teams quick access to information. This helps emergency teams quickly retrieve the patient information relevant to the situation, supporting correct medication and saving patients’ lives.
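The paper's architecture is only summarized above, so the snippet below is a hypothetical sketch of the access pattern it implies: the implant carries only an opaque identifier, clinical data stays server-side, and role-based rules limit what an emergency responder can read. All names (roles, fields, `emergency_lookup`) are invented for the example:

```python
# Server-side record store, keyed by the opaque ID stored on the implant.
RECORDS = {
    "tag-7f3a": {
        "name": "Jane Doe",
        "allergies": ["penicillin"],
        "history": "type 2 diabetes",
    }
}

# What each role is allowed to read in an emergency (illustrative policy).
ROLE_FIELDS = {
    "paramedic": {"allergies"},                       # minimum for safe treatment
    "er_physician": {"allergies", "history", "name"},
}

def emergency_lookup(tag_id, role):
    """Return only the fields the caller's role is entitled to see."""
    record = RECORDS.get(tag_id)
    if record is None:
        return None
    allowed = ROLE_FIELDS.get(role, set())
    return {k: v for k, v in record.items() if k in allowed}

print(emergency_lookup("tag-7f3a", "paramedic"))
```

Keeping only an opaque ID on the tag is one common way to limit what a skimming attacker learns; the privacy analysis in the paper addresses this class of threat.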
MASTER OF INFORMATION SYSTEMS SECURITY MANAGEMENT
Faculty: MANAGEMENT
Department: MANAGEMENT
Self-Sovereign Identity in Blockchain Technology
This book chapter provides an overview of self-sovereign identity (SSI) as a revolutionary approach to digital identity management. The chapter begins by tracing the evolution of digital identity management from centralized identity to SSI. The need for a safe, sustainable, and trustworthy digital identity in the virtual world is emphasized. SSI is differentiated from decentralized identity management, and the chapter examines SSI’s identity protocol and architecture, including verifiable credentials, decentralized identifiers, decentralized identity, and decentralized key management systems.
The chapter outlines twenty extended principles upon which SSI systems should be based, including sovereignty, data access control, data storage control, decentralization, verifiability, recovery, cost-free, security, privacy, safeguard, flexibility, accessibility, availability, transparency, portability, interoperability, scalability, sustainability, and longevity. These principles provide a comprehensive framework for establishing SSI systems.
The chapter further discusses the benefits and challenges of SSI systems. Benefits include data privacy in healthcare, efficient
financial services, and staff identification systems. Challenges include lack of legal recognition, user acceptance, interoperability, and scalability. Additionally, the chapter explores use cases of SSI systems, such as part lifecycle support, competency assurance, Know Your Customer (KYC), Non-Fungible Tokens (NFT), and authentication, authorization, and trust of Internet of Things users and devices. Finally, the chapter provides recommendations for implementing SSI systems in practice. The recommendations include implementing a user-centric design, ensuring open standards and interoperability, engaging stakeholders, developing trust frameworks, establishing governance and policy frameworks, providing education and awareness programs, and developing secure and sustainable systems.
In conclusion, this book chapter provides a comprehensive overview of self-sovereign identity (SSI) as a revolutionary shift in digital identity management. It examines the identity protocol and architecture of SSI, along with the benefits, challenges, and applications of the SSI system. It also outlines a set of principles and recommendations for establishing and implementing SSI systems in practice.
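The verifiable-credential flow at the heart of SSI can be sketched with a signed payload that any verifier can check without contacting the subject. The HMAC below is a deliberate stand-in for the asymmetric signatures and DID resolution a real SSI system uses; the key, DID, and claim names are invented for the example:

```python
import hashlib
import hmac
import json

ISSUER_KEY = b"issuer-demo-key"  # stand-in: real SSI uses asymmetric keys/DIDs

def issue_credential(subject_did, claims):
    """Issuer signs the credential payload; the holder stores the result."""
    payload = {"subject": subject_did, "claims": claims}
    body = json.dumps(payload, sort_keys=True).encode()
    proof = hmac.new(ISSUER_KEY, body, hashlib.sha256).hexdigest()
    return {**payload, "proof": proof}

def verify_credential(credential):
    """Verifier checks the proof against the payload it covers."""
    payload = {k: v for k, v in credential.items() if k != "proof"}
    body = json.dumps(payload, sort_keys=True).encode()
    expected = hmac.new(ISSUER_KEY, body, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, credential["proof"])

vc = issue_credential("did:example:alice", {"degree": "MISSM"})
assert verify_credential(vc)
vc["claims"]["degree"] = "PhD"   # tampering with a claim is detected
assert not verify_credential(vc)
```

In a full SSI stack the proof would be a public-key signature resolvable through the issuer's decentralized identifier, so verification needs no shared secret.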
Faculty: MANAGEMENT
Department: MANAGEMENT
Securing RFID Data Transmission with the KLEIN Algorithm
This research paper addresses the pressing concern of securing RFID systems while ensuring smooth communication between the reader and the tag. The significance of RFID technology has been increasing across various industries, but without appropriate security measures, RFID systems are vulnerable to attacks. With technological advancements, it has become easier for attackers to exploit security weaknesses in RFID systems, thereby increasing the need for robust security methodologies. In this study, we compare different security methodologies and find that lightweight cryptographic
algorithms are the best available option to ensure the security of RFID systems. We analyze various lightweight cryptographic algorithms and choose the KLEIN algorithm for implementation with RFID systems due to its low power consumption. Additionally, we simulate in software an RFID system integrated with the KLEIN algorithm to enhance system security. This study covers three different scenarios: the need for securing RFID systems, the comparison of security methodologies, and the implementation and simulation of the KLEIN algorithm in an RFID system.
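The abstract does not reproduce KLEIN's internals, and the cipher below is explicitly NOT the KLEIN specification: it is a deliberately simplified 16-bit substitution-permutation toy with an arbitrary S-box, showing only the round structure (key addition, substitution, diffusion) that lightweight ciphers of this family share:

```python
# Arbitrary demo 4-bit S-box (a permutation of 0..15), NOT KLEIN's S-box.
SBOX = [0x7, 0x4, 0xA, 0x9, 0x1, 0xF, 0xB, 0x0,
        0xC, 0x3, 0x2, 0x6, 0x8, 0xE, 0xD, 0x5]
INV_SBOX = [SBOX.index(i) for i in range(16)]

def sub_nibbles(state, box):
    """Apply the 4-bit S-box to each nibble of the 16-bit state."""
    return sum(box[(state >> (4 * i)) & 0xF] << (4 * i) for i in range(4))

def rotate(state, n):
    """Rotate the 16-bit state left by n bits (cheap diffusion)."""
    return ((state << n) | (state >> (16 - n))) & 0xFFFF

def encrypt(block, round_keys):
    for k in round_keys:
        block ^= k                        # AddRoundKey
        block = sub_nibbles(block, SBOX)  # substitution (confusion)
        block = rotate(block, 4)          # permutation (diffusion)
    return block

def decrypt(block, round_keys):
    for k in reversed(round_keys):
        block = rotate(block, 12)             # undo the left rotation
        block = sub_nibbles(block, INV_SBOX)  # invert the S-box
        block ^= k
    return block

keys = [0x1A2B, 0x3C4D, 0x5E6F, 0x7081]   # toy round keys, no key schedule
ct = encrypt(0xBEEF, keys)
assert decrypt(ct, keys) == 0xBEEF
```

Real lightweight ciphers such as KLEIN get their efficiency from exactly this kind of small S-box and cheap diffusion layer, with a proper key schedule and enough rounds for security margins.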
MASTER OF INFORMATION SYSTEMS ASSURANCE MANAGEMENT
Faculty: MANAGEMENT
Department: MANAGEMENT
Analysis of Interest Flooding DDoS Attacks on Named Data Networking (NDN)
Named Data Networking (NDN) is an evolving network technology that promises to meet the increasing demands of content delivery, scalability, mobility, and networking while addressing current security issues on the Internet. NDN is a name-based architecture: instead of IP addresses, the name of the object, i.e., the Named Data Object (NDO), is used. Data retrieval works on interest and data packets. The consumer sends an interest packet for named data, which is routed through the network; the content is fetched from the nearest caching router and returned to the consumer. Interest flooding has been identified as a type of Distributed Denial of Service
(DDoS) attack in NDN and is one of the major issues affecting the confidentiality, integrity, and availability of NDN networks. In this research, we demonstrate the DDoS attack on a simple 11-node NDN model and then repeat the scenario on a 100-node NDN model to compare and analyze how the attack affects the nodes and impacts service availability. These experiments are supported with measured metrics such as bandwidth, cache hit ratio, latency, jitter, throughput, and response time, presented graphically across four different simulation scenarios.
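The mechanism behind interest flooding can be shown with a toy single-router model: unique bogus interests continually evict legitimate content from the content store, collapsing the cache hit ratio seen by real consumers. The catalogue size, cache capacity, and attack rate below are illustrative choices, not the paper's ndnSIM setups:

```python
from collections import OrderedDict
import random

class Router:
    """Single NDN router with a fixed-size LRU content store (toy model)."""
    def __init__(self, capacity=50):
        self.store = OrderedDict()
        self.capacity = capacity

    def request(self, name):
        """Return True on a content-store hit, False on a miss."""
        if name in self.store:
            self.store.move_to_end(name)    # serve from cache
            return True
        self.store[name] = True             # miss: fetch and cache the data
        if len(self.store) > self.capacity:
            self.store.popitem(last=False)  # evict the least recently used
        return False

popular = [f"/video/{i}" for i in range(60)]  # legitimate content catalogue

def legit_hit_ratio(router, attack_rate):
    """Hit ratio seen by legitimate consumers, with `attack_rate`
    bogus interests injected per legitimate request."""
    random.seed(0)
    hits = 0
    for t in range(5000):
        hits += router.request(random.choice(popular))
        for j in range(attack_rate):        # unique junk names evict real data
            router.request(f"/junk/{t}/{j}")
    return hits / 5000

print("baseline :", round(legit_hit_ratio(Router(), 0), 3))
print("flooded  :", round(legit_hit_ratio(Router(), 5), 3))
```

Cache hit ratio is one of the metrics the study measures; this sketch shows why it degrades sharply under flooding.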
Designing and Securing Wi-Fi Connected Autonomous Vehicle
An autonomous vehicle (AV), or driverless vehicle, is one in which key functions such as steering, speed, and braking are controlled automatically by the vehicle, with no human intervention needed. This increases driving efficiency, passenger safety, and comfort. As the use of autonomous vehicles increases, the scope and severity of dangers will also expand. While the cars’ connection to linked technologies like the cloud via Wi-Fi enables increased speed and service quality, it also introduces additional dangers in the form of attacks by threat actors seeking to use these channels to their advantage. While wireless connectivity and cloud computing enable the provision of a diverse variety of dynamic resources, security is widely viewed as a major risk in cloud-connected automobiles. Our project aims to demonstrate the design and building of an autonomous vehicle, including neural network training using machine learning. Furthermore, our project focuses on how to secure the wireless connectivity of self-driving cars by detecting deauthentication and man-in-the-middle attacks. This is achieved by creating different attack scenarios, performing attack analysis, and applying mitigation strategies. To build the autonomous vehicle, we used a Raspberry Pi as the main processing unit, acting as the master device and sending instructions to an Arduino Uno
(slave device). Sign and image detection is achieved through a neural network trained using machine learning.
To demonstrate the deauthentication and man-in-the-middle attacks, a Kali Linux machine is used as the attacking machine. Kali has various built-in tools for wireless hacking. For scanning and decrypting wireless networks, Kali Linux comes with the Aircrack-ng suite of programs, including Airodump-ng, which our team used to execute the deauthentication attack. To illustrate the man-in-the-middle attack, we used two tools, namely Arpspoof and Ettercap. Arpspoof is a command-line (CLI) tool used to perform this attack. The autonomous vehicle is the target of an ARP (Address Resolution Protocol) spoofing attack launched by Arpspoof. As a result, all traffic between the AV and the default gateway passes through the attacker’s machine, making it possible to record it using Wireshark; Arpspoof also forwards this traffic. Ettercap is a GUI-based tool that can put the attacker between two machines and then allow the attacker to spoof the domain name server. Based on our analysis, enabling a virtual private network (VPN) is the best mitigation technique against man-in-the-middle attacks, and in the case of deauthentication attacks, using Wi-Fi Protected Access 3 (WPA3) was found to be the best solution.
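A common detection heuristic for the ARP-spoofing stage described above is to flag any change in the MAC address claimed for the default gateway. The sketch below simulates this over a list of ARP replies; it is an illustration of the idea, not the project's tooling, and its trust-on-first-use baseline assumes the first observed gateway reply is legitimate:

```python
GATEWAY_IP = "192.168.0.1"   # illustrative address, not from the project

def detect_arp_spoof(arp_replies, gateway_ip=GATEWAY_IP):
    """arp_replies: (sender_ip, sender_mac) tuples in arrival order.
    Returns the MACs that conflicted with the first-seen gateway MAC."""
    known_mac = None
    alerts = []
    for ip, mac in arp_replies:
        if ip != gateway_ip:
            continue
        if known_mac is None:
            known_mac = mac          # trust-on-first-use baseline
        elif mac != known_mac:
            alerts.append(mac)       # gateway IP now maps to a new MAC
    return alerts

replies = [
    ("192.168.0.1", "aa:aa:aa:aa:aa:aa"),   # legitimate gateway
    ("192.168.0.7", "bb:bb:bb:bb:bb:bb"),   # unrelated host
    ("192.168.0.1", "cc:cc:cc:cc:cc:cc"),   # spoofer impersonating gateway
]
assert detect_arp_spoof(replies) == ["cc:cc:cc:cc:cc:cc"]
```

A live detector would feed this logic from captured ARP traffic; mitigations such as a VPN, as the study concludes, protect the payload even when spoofing succeeds.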
MASTER OF INFORMATION SYSTEMS SECURITY MANAGEMENT
Faculty: MANAGEMENT
Department: MANAGEMENT
Performance Analysis of Caching Strategies in Flashlight Topology in Named Data Networking
With the growing use of Internet of Things (IoT) devices and other technologies, data volumes are increasing, and it is becoming necessary to manage data rather than network devices. Hence, researchers are trying to reduce reliance on network devices and to improve data delivery, performance, and security through Information-Centric Networking (ICN). Whether an architecture is existing or new, performance and security issues persist and must be mitigated. In ICN, security is crucial because the demand for data is high. The purpose of this paper is to evaluate and determine the comparatively better caching algorithm, on the basis of performance parameters and security attacks, in a self-built
large topology of 111 nodes named flashlight. The ndnSIM simulator has been used to estimate performance differences across the implemented caching strategies. The paper also reports packet-drop results for each caching policy, both with and without security attacks, and discusses the differences between them. The primary focus is to improve Named Data Networking (NDN) performance by selecting the best approach and mitigating security vulnerabilities alongside existing solutions, so that new and better performance and security solutions can be proposed in the future.
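The kind of caching-policy comparison the study runs in ndnSIM can be approximated in a few lines: replay one request trace against two replacement policies and compare hit ratios. The workload, cache size, and the choice of LRU vs. FIFO below are illustrative, not the paper's configuration:

```python
import random
from collections import OrderedDict

def hit_ratio(policy, requests, capacity=20):
    """Replay `requests` against an LRU or FIFO cache; return the hit ratio."""
    cache = OrderedDict()
    hits = 0
    for name in requests:
        if name in cache:
            hits += 1
            if policy == "LRU":
                cache.move_to_end(name)    # refresh recency; FIFO ignores reuse
        else:
            cache[name] = True
            if len(cache) > capacity:
                cache.popitem(last=False)  # evict the oldest entry
    return hits / len(requests)

# Zipf-like workload: low-numbered content names are requested far more often.
random.seed(1)
trace = [f"/content/{min(int(random.paretovariate(1.2)), 200)}"
         for _ in range(20000)]

print("LRU :", round(hit_ratio("LRU", trace), 3))
print("FIFO:", round(hit_ratio("FIFO", trace), 3))
```

Under skewed popularity, a recency-aware policy keeps hot content resident while FIFO periodically evicts it, which is the sort of difference the paper's parameter measurements surface at scale.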
MASTER OF INFORMATION SYSTEMS SECURITY MANAGEMENT
Faculty: MANAGEMENT
Department: MANAGEMENT
Information-Centric Networking (ICN) Based Disaster Recovery and Business Continuity (DRBC) of Bangladesh
We are proposing a technology-based solution for a Disaster Recovery Management System (DRMS) for a country such as Bangladesh, where disasters happen almost year-round. Bangladesh has very well-developed disaster management systems, plans, and processes. However, without robust technology-based systems, disaster-management activities in remote locations or during devastating disasters are delayed or fail. Geographically, Bangladesh is disaster-prone; every year, the country suffers a great deal of economic damage. It is important to have the DRMS ready and available to mitigate, prepare, and communicate with different groups to reduce losses and save lives. Technology-based disaster management is important above all for information, relief, shelters, and emergency management. Communication technology can play a vital role in technology-based disaster management. Also, during
a disaster, communication systems and related infrastructure are often damaged by power failures and other destruction. An Information-Centric Networking (ICN) based Disaster Recovery and Business Continuity (DRBC) system offers a higher response ratio, efficient performance, low communication overhead and path distance, and lower distribution delay. It has also been shown that ICN is a new paradigm whose mobility, security, and traffic-handling properties can support highly available, secure communication. We are proposing an ICN-based Disaster Recovery System for Bangladesh that will efficiently manage natural and man-made disasters in the country.
Keywords: Disaster-prone, Information-Centric Networking (ICN), Disaster Recovery and Business Continuity (DRBC), Disaster Recovery Management System (DRMS), Communication Technology, Relief, Emergency Management, Shelter
MASTER OF INFORMATION SYSTEMS ASSURANCE MANAGEMENT
Faculty: MANAGEMENT
Department: MANAGEMENT
Security Analysis of CRYSTALS-Kyber Algorithm
The National Institute of Standards and Technology (NIST) initiative to standardize post-quantum cryptography has been actively evaluating post-quantum cryptographic algorithms.
This paper examines CRYSTALS-Kyber, a lattice-based key encapsulation mechanism investigated as a standardization candidate. NIST recently selected CRYSTALS-Kyber as a new public-key encryption and key-establishment algorithm for standardization, which underscores the importance of assessing how well its implementations withstand side-channel attacks.
In this paper, a profiling side-channel attack against a hardware implementation of CRYSTALS-Kyber with security parameter k = 2 (Kyber512) is shown. First, power-based side-channel flaws in the Fujisaki-Okamoto transform are identified, allowing non-ECC methods to leak information about decrypted messages. These vulnerabilities were exploited, and the captured traces were applied to the
algorithm to demonstrate practical attacks. The aim is to experimentally validate attacks on implementations sourced from the pqm4 library, executed on the ATmega328 microcontroller. These attacks result in complete key recovery, in the form of secret coefficients, on the targeted scheme. Additionally, the attacks can retrieve long-term secret keys within a few hundred chosen-ciphertext queries, indicating the feasibility of the approach. The attack relies on building flawed ciphertexts that ensure a specific intermediate variable becomes closely linked to the secret key when decapsulated by the target device. An attacker who uses side channels to gain information about this secret-dependent variable can then recover the entire secret key. Based on the project, the success rate of the performed side-channel attack is around 98%; after further experimentation, the success rate would be reduced.
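The chosen-ciphertext principle behind such attacks can be shown with a noise-free toy: an unprotected decryption routine acts as an oracle, and crafted ciphertexts make its output bit flip exactly where the secret lies. Only the modulus below matches Kyber; the scheme, the secret, and the oracle interface are invented for illustration and this is emphatically not the paper's attack:

```python
Q = 3329       # Kyber's modulus, reused for flavour
SECRET = 57    # the device's secret coefficient, unknown to the attacker

def decrypt_bit(a, b, s=SECRET, q=Q):
    """Toy decryption: recover one bit as round(2*(b - a*s)/q) mod 2."""
    v = (b - a * s) % q
    return ((2 * v + q // 2) // q) % 2

def recover_secret(oracle, q=Q, s_max=200):
    """Query crafted ciphertexts (a=1, b=t) and watch where the decrypted
    bit first flips to 1; assumes the secret lies in [0, s_max]."""
    threshold = q // 4 + 1                  # smallest v decrypting to 1
    for t in range(s_max + threshold + 1):
        if oracle(1, t):                    # flip occurs at t == s + threshold
            return t - threshold
    return None

assert recover_secret(decrypt_bit) == SECRET
```

The real attack works indirectly: the Fujisaki-Okamoto transform hides decryption failures, so the "oracle" must be read through power traces rather than returned bits, which is exactly what the profiling step provides.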
MASTER OF INFORMATION SYSTEMS SECURITY MANAGEMENT
Faculty: MANAGEMENT
Department: MANAGEMENT
I wanna know what love is: Making a case for white male love
Teresa Fowler

Bourdieu’s theory of masculine domination offers an approach to understanding the constraints white cis/hetero men face with respect to love. Stereotypes of masculinity cause men to adopt traits such as stoicism, loyalty, and aggression, and to suppress their emotions. As gender relations shift in the wake of the #MeToo movement, men have been challenged to reconsider their position in the gender order. However, because men have been socialized into unhealthy forms of masculinity, they struggle with their relationship with love. This study engaged with white cis/hetero men to learn more about their experiences with love through a qualitative approach, which revealed that participants were constrained by masculine domination and did not step outside expressions of love that might have risked their relationship with masculinity. Participants desired to be more emotionally expressive; however, the risk led them at times to lean into hypermasculinity to find the love they were seeking.

On Making a Difference
Jonathan Strand
Background: In recent years there has been a resurgence of philosophical work on the concept of “the meaning of life” or “meaning in life.” Most of this work has focused on one of three things: 1) what we mean by ‘meaning of/in life’; 2) what factors determine the extent to which this property is exemplified; and 3) to what extent, then, our diverse lives are meaningful.
Most have concluded that there are various things which people refer to as ‘meaning’ in life. Or at the very least, there are various factors which contribute to meaning in life. One thing that many seem to have in mind is ‘significance’—not in the sense of semantic meaning, but in the sense of “mattering.” And this seems to largely amount to “making a difference.” I.e., how different are things from the way they would have been if I had not lived and done what I have done?
Purpose: The aim of this paper is to provide a detailed analysis of the concept of “making a difference.” It is to determine what properties that property has. It will then observe the consequences of that analysis in terms of whether, when, and to what extent our lives are significant by virtue of making a difference.
Method: The paper uses the traditional philosophical methods of conceptual analysis. It draws conclusions about when, and to what extent, something “makes a difference” by observing when and how people apply that phrase. It uses thought experiments to further elucidate the meaning of the phrase.
Results: The properties of making a difference include the following:
1. Something makes a difference when things would have been different than
they are, had that thing not existed or occurred.
2. Various types of things can make a difference: Things that exist, events that occur, actions, states-of-affairs.
3. Things can make a difference in many different, specific ways.
4. It is possible for a thing to make more or less of a difference in a particular way, or overall; and some things can make more of a difference than others in specific ways, or overall.
5. Everything makes some difference merely by virtue of existing.
6. Some differences things make are axiological; a thing can make things better or worse.
7. The manners and degrees to which something makes a difference can vary over time.
8. The differences a thing may make “under an assumption” may differ from the differences it makes under other assumptions.
Conclusion:
1. This analysis seems to entail an answer to an item of current debate: We and our lives are much more significant, in this sense, if God exists than if God does not exist.
2. The analysis also seems to support an answer to The Problem of Evil: God cannot create creatures which have great significance in this sense without granting them the power to make things much better or worse—and so permitting serious evil.
Department: PHILOSOPHY AND RELIGIOUS STUDIES
Climate change and vulnerable young adults’ mental health: A systematic literature review
Following one of the most extreme winters in Canada, climate change is a critical concern for Canadians. The yearly surface air temperature over Canada’s landmass has increased by twice the global average. Climate change is putting Canadians’ health at risk. Specifically, increased ambient air pollution and exposure to ground-level ozone, wildfires, mold, and pollen/spores may exacerbate existing respiratory disorders and increase the risk of cardiovascular disease, resulting in early death. Young adults are particularly affected and vulnerable. However, limited literature exists that examines the effect of climate change on the mental health of this population. In this study, the overarching aim was to examine young adults’ understanding of climate change in their communities, their experience with anxiety related to climate change, and how they may be better supported to prevent negative mental health implications. Literature was reviewed on PubMed, PsycINFO, and MEDLINE, yielding a total of 500 articles, of which 30 met the inclusion criteria for analysis. Studies deemed eligible for the review were based on the following criteria: (i) written in English, (ii) published between
2013 and 2023, (iii) published in peer-reviewed journals, and (iv) those related to climate change, mental health, and young adults. Findings suggest that younger adults tend to report more significant climate change-related anxiety. Climate change-related anxiety also increases in people who care deeply for the environment. Individuals with pre-existing mental health problems may be strongly affected by climate change stressors. Additionally, semi-structured qualitative interview data is being collected using snowball sampling (current n = 14) to explore lived experiences of young adults with climate change anxiety. We anticipate this data providing insight into the programs and interventions the government and local communities need to address climate change and air pollution issues that cause anxiety. Additional work is required to improve anxiety prevention programs and develop multiple community-based strategies to prepare for, and cope with, climate-related stressors. Work is also required to spread awareness of the effects of climate change to motivate climate-wise decision-making and activism amongst the general public.
Faculty: ARTS
Department: PSYCHOLOGY
Virtual Reality in Mental Health: A Systematic Review of Benefits and Concerns
Technological advancements in psychology have developed rapidly, and a seismic shift in mental health services is on the horizon. Virtual Reality (VR) may be the future of psychological services, partly because of its potential to provide clinically relevant information for assessing and treating mental illnesses. VR is a computer-generated simulation that creates an immersive and interactive environment. It typically involves a head-mounted display, which provides a visual display of the simulated environment, and may also include gloves and other accessories that allow the user to interact with the virtual environment. VR creates an experience more or less indistinguishable from “reality,” with the user immersed in a simulated world. VR has become increasingly relevant in mental health research and treatment, presenting a range of potential benefits and concerns, including its clinical utility, as well as the reliability and validity of various assessments and therapy outcomes. In this study, we aimed to provide an up-to-date understanding of VR in mental health settings, considering the benefits, concerns, and new directions of interest. We conducted a systematic review following the Preferred Reporting Items for Systematic Reviews and Meta-Analyses guidelines on MEDLINE, PsycINFO, and SCOPUS databases. We utilized a narrative analysis approach to analyze and synthesize the selected studies. Inclusion criteria included: (i) published in peer-reviewed journals; (ii) written in English; (iii) used VR apparatus; (iv) examined outpatient populations; (v) published within the years 2013 to 2023; (vi) discussed
benefits and concerns of VR for the assessment of psychopathology. Studies were excluded if they (i) included interventions with other immersive technologies (e.g., Augmented Reality); (ii) were conference proceedings, dissertations, abstracts, case reports, or reviews; (iii) did not report the outcome of interest; or (iv) did not include clinical data. The search yielded 1,473 potentially eligible articles, from which 25 studies met the study criteria and were included. Our systematic review revealed that VR-based interventions were perceived as both feasible and enjoyable for individuals with psychosis, allowed individuals to develop skills that lead to functional improvements, such as autonomy and managing everyday home and work tasks, and provided clinicians with precise and consistent control over the presentation of various stimuli. Although there are purported advantages, such immersive environments have no set guidelines for data interpretation, can result in overstimulated behaviour for those with neurodevelopmental disorders, and are difficult to use for geriatric populations. Our systematic review revealed a gap in the existing literature: while there is promise in using VR tools to improve psychological assessments, interventions, and outcomes, no research has examined clinicians’ perceptions regarding the use of VR assessments in clinical practice. Such information is crucial to offering a clinical perspective beyond the technology itself. Therefore, future research will conduct a follow-up sequential mixed-methods study to understand clinicians’ perceptions regarding the use of VR in clinical practice.
Enhanced surveillance for climate change effects on chronic disease in Alberta: A literature review and qualitative summary of stakeholders and vulnerable populations
Background: Growing effects of climate change on chronic health and wellness disproportionately affect vulnerable groups (e.g., rural communities, Indigenous communities, houseless populations, immigrants, older adults, pregnant women, children, young adults). The ACCLIMATES project is an Alberta-wide initiative to implement a surveillance system to monitor and respond to climate change-related chronic health concerns. It is externally funded by the Public Health Agency of Canada.
Purpose: To better understand the effects of climate change and the utility of a surveillance system on chronic health of vulnerable populations, a systematic review of the literature was conducted for each vulnerable group and the role of stakeholders at each level. Qualitative interviews are currently underway to explore lived experience and attitudes of vulnerable groups and stakeholders.
Methods: A systematic literature review was conducted using PsycINFO, MEDLINE, and Academic Search Complete on the effects of climate change on chronic disease in vulnerable populations between 2000 and 2023. Semi-structured qualitative interviews with stakeholders representing vulnerable groups are currently underway (current n = 6). A snowball sampling approach was used for stakeholder participant recruitment with initial emails being sent to key community
organization leaders to assess project interest.
Results: Results of the literature review and initial qualitative data indicated that effects of climate change vary greatly based on the vulnerable group. While children are highly affected by air quality due to sensitivity of their smaller lungs, young adults report climate change anxiety relating to the capacity of the environment to support current industry. Houseless and older populations are particularly vulnerable to extreme heat and cold events. Rural Albertans experience anxiety related to their livelihood as drought limits the agriculture industry’s ability to produce goods. Stakeholders consistently reported barriers in communication, funding, and availability of information for decision making to support their respective vulnerable groups.
Conclusions: Preliminary findings suggest the need for increased attention to climate change effects on chronic disease and wellness across provincial, territorial, and municipal levels, with concerted efforts to both address climate change and respond to increasing rates of anxiety, stress, and chronic disease related to climate change. While stakeholders generally hold positive views towards a province-wide surveillance system to address chronic health effects, such a system must be appropriately tailored to the diverse needs of our diverse province.
Exploring the Use of Somatotypes in Online Reddit Discussions
Background: This research investigates the context in which the terminology of somatotypes (aka body types) is still used today. In the mid-20th century, American psychiatrist William Sheldon developed the idea of the three somatotypes (ectomorph, mesomorph, endomorph), which he believed had a relationship with personality, temperament, and criminality. Despite Sheldon’s constitutional psychology falling out of favour, some of its terminology is still found in popular culture and academic or research spaces. Purpose: Previous research has explored Instagram, Facebook, and Pinterest when looking into body image related issues. However, Reddit is an underexplored social media format, as is the persistence of somatotypes. Methods: This research explored the prevalence of somatotype terminology on Reddit using the R package “RedditExtractoR”. After removing posts with fewer than 10 comments as well as any non-English posts, 747 posts remained. A thematic
analysis was conducted using the methods highlighted by Braun and Clarke (2006). Quantitative analyses included text analysis, sentiment analysis, and Latent Dirichlet Allocation (LDA) topic modelling, which we compared directly with our thematic analysis. Results/Conclusion: Findings suggest that Sheldon’s somatotypes are still being used in academic research, pop culture, and social media. However, these words are only being used as a method of classification rather than being discussed as having any relationship to human behaviour. Our qualitative findings suggest somatotypes are largely being used in fitness-related discussions, specifically regarding discussions around changing one’s body type. Themes that emerged include metabolism, exercise regimens, genetics, and more. Furthermore, findings suggest there were more positive sentiments in the discussion around body types compared to what is seen in other research exploring Instagram, Facebook, and Pinterest.
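The term-frequency step underlying such text analysis can be sketched in a few lines: count somatotype mentions and the words co-occurring with them. The posts below are invented examples, not actual Reddit data, and the study itself used R (RedditExtractoR) rather than this Python stand-in:

```python
from collections import Counter
import re

SOMATOTYPES = {"ectomorph", "mesomorph", "endomorph"}

# Invented example posts mimicking the fitness-related discussions observed.
posts = [
    "As an ectomorph I struggle to gain muscle no matter the diet",
    "Endomorph here, my metabolism makes cutting hard",
    "Mesomorph genetics make exercise progress faster",
]

def tokenize(text):
    """Lowercase a post and split it into alphabetic tokens."""
    return re.findall(r"[a-z]+", text.lower())

mentions = Counter()   # how often each somatotype term appears
co_terms = Counter()   # words appearing alongside a somatotype term

for post in posts:
    words = tokenize(post)
    hit = [w for w in words if w in SOMATOTYPES]
    mentions.update(hit)
    if hit:
        co_terms.update(w for w in words if w not in SOMATOTYPES)

print(mentions.most_common())
print(co_terms.most_common(5))
```

Co-occurrence counts like these feed both the sentiment analysis and the LDA topic model, which cluster somatotype talk with fitness vocabulary such as metabolism and exercise.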
Relation between University Tuition and Minimum Wage in Alberta over 40 years: An Exploratory Archival Study
Studies of sources of student anxiety, both in Canada and internationally, consistently identify financial stress as a factor contributing both directly and indirectly to students’ anxiety and overall well-being (Jones et al., 2018). In a narrative literature review of 41 empirical research articles designed to identify risk factors for poor mental health outcomes (stress, anxiety, and depression) in both developed and developing countries over a 20-year period (2000-2020), Moffateh (2021) identified insufficient financial support and financial worries as one of six risk factors for poor mental health outcomes. This finding was consistent across the 10 countries in which these studies were conducted. Higher levels of stress and anxiety among students are, in turn, associated with lower overall perceived quality of life (Ribero et al., 2018) and a variety of other risk factors, such as food insecurity (Hattangadi et al., 2019); the number of hours worked to meet financial needs is, in turn, negatively correlated with academic performance (Hui, 2014).
Our hypothesis, based on our own experiences as undergraduate students in the 1980s, was that the
relation between minimum wage and tuition has changed over time, such that increases in tuition and other mandatory fees have significantly outstripped increases in the minimum wage. We also hypothesized that the amount of student debt would have increased significantly across the same period. Our focus was on Alberta in particular, although we also examine tuition increases in provinces across Canada. Consequently, we gathered tuition and minimum wage data from archival sources (i.e., course calendars from the University of Alberta; Government of Alberta databases) to evaluate the relation between these two policy-related factors that influence whether students will be able to comfortably finance their university tuition and fees. As expected, we found that in the early 1980s a student working a minimum wage job for 4 months could pay their tuition 3 times over, whereas by the late 1990s the same student could pay their tuition, but would have no money remaining for other expenses. We conclude that tuition and minimum wage policies work together to influence the financial risk factors to students’ success and well-being.
Using 3D-Printed Models in Learning Geological Concepts
Teaching science courses often involves the use of visual aids, such as computer renderings or physical models. 3D printing provides a unique opportunity for instructors to use physical models in the classroom for in-person learning. Nonetheless, no studies have shown whether a better perception of scientific concepts and phenomena can be achieved by using 3D-printed models in teaching geoscience courses. This study analyzes how students retain information based on their interaction with 3D-printed models of a chosen geoscience concept. Our research question is whether 3D-printed models can help undergraduate students with the comprehension and retention of geological information and concepts better than textbook figures or computer models displayed on a screen. We designed a synthetic model of a geological terrain as images on paper, interactive models on a computer screen, and 3D-printed objects. The project included three stages: a study stage and two test stages. The study stage took place during one class for the students enrolled in a first-year environmental science course, who were divided into four groups:
1) paper group, where participants had only a printed image of the geological terrain;
2) computer group, where students had the same model but in a digital format displayed on the screen;
3) 3D group, where only a 3D-printed model was handed out to participants; and
4) integrated group, where participants had access to all model types of the same terrain - paper, computer, and 3D-printed. The first test stage took
place during the same class as the study stage. Participants were assessed with 15 questions, administered via the Qualtrics platform, on their understanding of the morphology and physical features of the terrain studied. In addition, participants completed a survey about their experience and attitude toward the learning efficiency of paper, computer, or 3D-printed models. On the knowledge test (out of 15 questions) in the first test stage, the integrated group showed the highest performance with a mean (M) of 6.26 and a standard deviation (SD) of 1.97, followed by the paper (M=6.22, SD=1.35), computer (M=6.15, SD=1.46), and 3D model groups (M=5.44, SD=1.69). According to the survey responses, students most strongly endorsed the usefulness, ease, and comfort of using 3D-printed models (for both the integrated and 3D model groups) to learn the geological material, followed by computer models and then images on paper. While the survey findings suggest that students may prefer 3D-printed terrains (as compared to paper and computer models) for a better learning experience, the knowledge test results suggest that paper-based materials (e.g., handouts or textbooks) may provide a more effective way of retaining information. Findings from the second test session (which will take place two weeks after the first test stage) will allow us to examine the longer-term retention of information about the model and will contribute to the broader scope of finding more efficient methods of teaching science using tangible models.
Faculty: ARTS
Department: PSYCHOLOGY
Hope Sabo, Carolina Mendes, Aaron Bender, Christina Gagné, Thomas Spalding, Alex Taikh
The influence of constituent and pseudo-constituent activation in reading compounds and pseudo-compounds
Background: The morphemes embedded in compounds and pseudo-compounds become available and influence the identification and reading of words. Compounds are combinations of two or more constituent morphemes (e.g., high and light function as morphemes in highlight). Pseudo-compounds are words in which the embedded words do not function as morphemes (sea and son do not function as morphemes in season). We examine the role of the morpheme boundary letters (highlight or season) in accessing the embedded morphemes when processing compound and pseudo-compound words.
Purpose/Aim/Hypothesis: We compare the effect of interfering with boundary letters and constituent-internal letters on the identification of compound and pseudo-compound words. We hypothesize that boundary letters contribute more than constituent-internal letters to accessing compounds but not pseudo-compounds. Thus, interfering with the boundary (rather than constituent-internal) letters will slow down recognition of compound words but not pseudo-compound words.
Method: Across four experiments, we examine whether the interference from changing letters (replacement vs. transposition) depends on the position of the letter change (morphemic boundary vs. inside of a constituent). In Experiment 1, we compared the effect of quickly presented compound words (masked primes) with replaced or transposed letters at the morphemic boundary (higmkight vs. higlhight) or inside of the first constituent
(hbohlight vs. hgihlight) on the recognition of compound words (highlight). Experiment 2 compared the effect of masked pseudo-compound primes (season) with replaced and transposed letters at the morphemic boundary or inside of the first constituent on the recognition of pseudo-compound words. Experiments 3 (compounds) and 4 (pseudo-compounds) used masked primes with replaced and transposed letters at the morphemic boundary, or inside of the second constituent.
Results: Replacing (vs. transposing) letters slowed down identification of compound words when the letter changes were at the boundary but not inside of the first constituent (Experiment 1) or the second constituent (Experiment 3). For pseudo-compounds, replacing (vs. transposing) letters slowed down word identification when the letter change was at the boundary and inside of the first constituent (Experiment 2).
Conclusion: Our findings suggest that in both compounds and pseudo-compounds, replacing boundary letters interferes with activating the embedded morphemes. Replacing constituent-internal letters, however, interferes with activating embedded morphemes in pseudo-compounds only. In compounds, the unaltered morpheme may help compensate for the morpheme whose letters have been replaced. In pseudo-compounds, the embedded words do not function as morphemes and do not provide compensation for the interference that comes from replacing letters.
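The prime manipulations described above can be illustrated with a short sketch. The positions and replacement letters below are illustrative only, chosen to reproduce the example primes given in the abstract (higmkight, higlhight):

```python
def transpose(word, i):
    """Swap the letters at positions i and i+1 (transposed-letter prime)."""
    chars = list(word)
    chars[i], chars[i + 1] = chars[i + 1], chars[i]
    return "".join(chars)

def replace(word, i, letters):
    """Replace the letters at positions i and i+1 (replaced-letter prime)."""
    chars = list(word)
    chars[i], chars[i + 1] = letters
    return "".join(chars)

# For the compound "highlight" (high + light), the morphemic boundary
# spans positions 3 and 4 ("h" and "l").
word = "highlight"
boundary = 3
print(transpose(word, boundary))      # higlhight
print(replace(word, boundary, "mk"))  # higmkight
```

The same two functions applied at a constituent-internal position generate the other prime conditions.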
Faculty: ARTS
Department: PSYCHOLOGY
Sequences of Grundy values in Node Kayles
Node Kayles is a two-player game played on a graph in which the players alternately select an empty vertex that is not adjacent to any previously selected vertex. The game ends when neither player can make a legal move, and the player who selected the last vertex wins
the game. Positions are analyzed using Grundy values. We study the Grundy values of Node Kayles played on different graphs, including paths. The focus is on determining whether there is any periodicity in the sequence of Grundy values we calculated.
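As an illustrative sketch (not the authors' code), the Grundy values of Node Kayles on paths can be computed recursively: selecting a vertex removes it and its neighbours, splitting the path into two independent subpaths whose Grundy values combine by XOR, and the position's value is the mex over all moves:

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def grundy(n):
    """Grundy value of Node Kayles on a path with n vertices.

    Selecting vertex i removes i and its neighbours, leaving
    independent subpaths of i-1 and n-i-2 vertices."""
    if n <= 0:
        return 0
    moves = set()
    for i in range(n):
        left = max(i - 1, 0)
        right = max(n - i - 2, 0)
        moves.add(grundy(left) ^ grundy(right))
    # mex: smallest non-negative integer not among the move values
    g = 0
    while g in moves:
        g += 1
    return g

print([grundy(n) for n in range(12)])  # [0, 1, 1, 2, 0, 3, 1, 1, 0, 3, 3, 2]
```

The resulting sequence is the kind whose (eventual) periodicity the study examines.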
Faculty: SCIENCE
Department: MATHEMATICAL AND PHYSICAL SCIENCES
Exploring High-Temperature Positions in the Game of Domineering
In the game of Domineering, two players named Left and Right take turns placing vertically and horizontally oriented dominoes on a board. Each domino covers two adjacent squares and cannot overlap or go beyond the board’s edges. The temperature of the current position is a measure of the urgency of the next move. Studying temperature
is crucial because the next move in some games can heavily impact the game’s final outcome. Berlekamp’s conjecture suggests that the maximum temperature in Domineering is 2. We developed code to search for positions in Domineering with temperatures close to 2, and discovered several positions exhibiting similar patterns.
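As a hedged illustration of the kind of position search described (not the authors' actual program), enumerating the legal vertical (Left) and horizontal (Right) domino placements on a board is the basic building block:

```python
def legal_moves(board):
    """Count legal vertical (Left) and horizontal (Right) domino
    placements on a rectangular grid; True marks an occupied square."""
    rows, cols = len(board), len(board[0])
    vertical = sum(
        1 for r in range(rows - 1) for c in range(cols)
        if not board[r][c] and not board[r + 1][c]
    )
    horizontal = sum(
        1 for r in range(rows) for c in range(cols - 1)
        if not board[r][c] and not board[r][c + 1]
    )
    return vertical, horizontal

# Empty 2x3 board: 3 vertical and 4 horizontal placements.
empty = [[False] * 3 for _ in range(2)]
print(legal_moves(empty))  # (3, 4)
```

A full temperature search would recurse over these moves and evaluate the resulting game values.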
Faculty: SCIENCE
Department: MATHEMATICAL AND PHYSICAL SCIENCES
Radical-initiated hydrosilylation: A convenient approach to mixed surface chemistry silicon quantum dots
Silicon nanoparticles (SiNPs) are quantum dots free of toxic metals that exhibit photoluminescent properties, making them attractive for far-reaching applications including medical and biological imaging. Previous hydrosilylation-based approaches have provided some limited access to reactive surfaces; however, competing reactions present challenges when target surface groups include pendant reactive moieties. Previously, our group demonstrated that radical-initiated hydrosilylation provides access to partially functionalized SiNPs that could be further modified via subsequent reactions. This presentation will outline our recent investigations into the development of a stepwise functionalization protocol that provides access to mixed-surface SiNPs through different types of hydrosilylation approaches.
Faculty: SCIENCE
Department: MATHEMATICAL AND PHYSICAL SCIENCES
Anionic Electrocyclizations to Form [3.3.0] Bicycles
Fused bicyclic rings are prevalent in a wide range of natural compounds that show varying biological activity. The development of efficient synthetic methods that form these key ring junctions is of constant interest. Electrocyclizations offer an approach that benefits from stereocontrol at the ring juncture as well as chemoselectivity in its initiation. Cross-conjugated cyclooctadienones were identified as an easily accessible class of substrates that cyclized after formation of a pentadienyl anion. Access to this anion was explored via a direct 1,2-Brook rearrangement; via a stepwise reduction/silylation/carbolithiation sequence; and via a reduction/carbamation/carbolithiation sequence.
Faculty: SCIENCE
Department: MATHEMATICAL AND PHYSICAL SCIENCES
Construction of 1,3,6-Trien-ones as Substrates for 7-Carbon Anionic Electrocyclizations
Electrocyclizations are a well-established method for the generation of carbocyclic rings with stereocontrol at the ring-closure position. Within this framework, very little has been reported on anionic methods to form the medicinally relevant seven-membered ring structure. In order to investigate the electrocyclization event, a method was developed to synthesize the required precursor substructure. A Knoevenagel condensation reaction was used to install two alkenes directly, while also maintaining an electron-withdrawing ester at C-4 of the skeleton. A series of starting keto-esters, as well as unsaturated aldehydes, was employed. Selective reduction of the ketone was also investigated. This series provided a rapid and effective means to produce the necessary substrates, which can now be investigated for the electrocyclization.
Faculty: SCIENCE
Department: MATHEMATICAL AND PHYSICAL SCIENCES
Functionalization of Silicon Nanocrystals via Sonication-Based Hydrosilylation
Hydrosilylation is a widely explored method for modifying the surface chemistry of silicon nanoparticles in order to enhance their functionality. These silicon nanoparticles hold significant potential and are attractive due to silicon’s abundance in the earth’s crust and its non-toxic nature, which allows its use in biological and environmental settings. These particles have diverse optical and electronic properties that depend on their size and surface molecules, making them suitable for a range of potential applications, including biological imaging, light-emitting diodes, and drug delivery. This study investigates the use of sonication as a hydrosilylation method to functionalize silicon nanoparticles with dodecene. Functionalization is performed both with and without the use of a radical initiator, azobisisobutyronitrile (AIBN). Compared to other methods such as thermal, photochemical, metal-catalyzed, and radical hydrosilylation, sonication offers several advantages, such as particle-size independence, room-temperature processing, and the production of a monolayer surface passivation without residual catalyst. The results demonstrate that sonication with dodecene and AIBN is an effective method for silicon nanoparticle functionalization. Without AIBN, the process yields similar functionalization results and avoids impurities from the radical initiator, but is significantly slower.
Determination of Internal Standards Candidates for the Quantification of Advanced Glycation End-Products in Food
Advanced glycation end-products (AGEs) can be formed by Maillard reactions that develop during various methods of cooking foods, including roasting and grilling. AGEs are a common component of the human daily diet; however, when consumed in high volumes they can lead to unwanted health risks, most commonly diabetes. Detection of AGEs is presently very difficult because an extensive range of products can form under the various conditions and environments in which they arise, and because the products form at very low concentrations, making the detection process even more difficult. This research aims to provide a means of comparison that can be used in future experiments for the detection and quantification of AGEs. We have used reverse-phase high-performance liquid chromatography (RP-HPLC) to identify a potential internal standard that could be used in comparison with known AGE samples. Several candidates were selected based on their similarities to previously tested AGE samples, with caffeine and xanthine showing the highest potential. Xanthine was later removed as a potential internal standard due to its low solubility in water, and caffeine showed better peak geometry when tested. AGE samples were prepared by incubating lysine and glucose with either iron or copper at 37 °C for up to 8 weeks. Once a colour change was observed, samples were concentrated under nitrogen and analyzed with HPLC-UV detection, along with caffeine samples at various concentrations. These chromatograms were compared to determine a suitable concentration of caffeine that could be matched to the low concentration of AGEs present in each sample; once this was determined, the AGE samples were spiked with caffeine. Samples containing both AGEs and caffeine were analyzed with HPLC-UV detection to confirm that caffeine is a viable internal standard.
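The internal-standard comparison described above follows the standard response-factor calculation; the peak areas, concentration, and response factor below are hypothetical values for illustration, not measurements from this study:

```python
def concentration_from_internal_standard(area_analyte, area_is,
                                         conc_is, response_factor):
    """Estimate analyte concentration from peak areas relative to a
    spiked internal standard: C_a = (A_a / A_is) * C_is / RF."""
    return (area_analyte / area_is) * conc_is / response_factor

# Hypothetical values: caffeine internal standard at 10 uM,
# response factor 1.25 determined from a calibration run.
c = concentration_from_internal_standard(4500.0, 12000.0, 10.0, 1.25)
print(round(c, 2))  # 3.0
```

Using the area *ratio* rather than the raw analyte area is what makes the quantification robust to injection-volume and detector drift.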
Efficient Energy Use: Can We Make a Shift in Consumers’ Behaviour?
Reliance on fossil fuels for energy has driven climate change, negatively impacting the environment and creating an unsustainable lifestyle for humans. Community members can incorporate some simple energy-saving habits to reduce their negative impact on the climate. The purpose of this study is to identify whether the energy consumption behaviour of residents of Edmonton and surrounding areas affects their perception of climate change and the transition to alternative energy sources. Our methodology involved two parts, with online surveys distributed among all Concordia University of Edmonton (CUE) members. The first part included a survey to identify the type, size, and approximate location (via a postal code) of participants’ homes, as well as to learn about their energy consumption patterns based on the building’s insulation, lighting, heating sources, and daily use of electronic devices and appliances. The second part included participants’ interaction with the newly developed energy calculator (built in cooperation with the McNeil Centre for Applied Renewable Energy at CUE) and a follow-up survey inquiring about current sources of heating, lighting, and appliances in the participants’ homes. After completing the second survey, participants provided feedback on the functionality and the most useful features of the online energy calculator. Based on their responses, the calculator generated recommendations on what
types of new energy-saving habits participants could incorporate daily to pay lower utility bills and reduce the pressure of greenhouse gas emissions on our environment. The preliminary results from 18 participants showed that the main electricity source was the power grid, and the main source of heating was the gas-forced-air central furnace. It was found that 95% of participants would be willing to switch to green energy sources if they had the opportunity and enough incentives from the federal and provincial governments. Of all participants, nine CUE members indicated that they had installed or inherited some energy-saving equipment or lighting sources in their homes. In addition, insulation in walls and windows plays an important role in the energy efficiency of buildings. By comparing participants’ homes, which ranged from one to 50 years in age, and analyzing the number of window panes and any renovations completed within the last 15 years, we observed a trend of lower energy use (in kWh) in properties with triple-glazed windows and recent renovations. The next phase of the study will use training data acquired through the first-phase surveys to address the research question of a continuous transition to efficient energy consumption. The outcomes of this study will be of paramount importance for communities, delivering the knowledge necessary to transform into a low-carbon economy.
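The core estimate behind an energy calculator of this kind can be sketched as power times hours of use per appliance; the appliances, wattages, and usage hours below are hypothetical examples, not data from the study or the actual CUE calculator:

```python
def daily_kwh(appliances):
    """Sum daily energy use across appliances: watts * hours / 1000."""
    return sum(watts * hours / 1000.0 for watts, hours in appliances.values())

# Hypothetical household: (power draw in W, hours used per day)
home = {
    "furnace fan": (500, 8),
    "fridge": (150, 24),
    "LED lighting": (60, 5),
    "laptop": (65, 6),
}
print(round(daily_kwh(home), 2))  # 8.29
```

Comparing such an estimate against a household's metered kWh is one simple way a calculator can flag savings opportunities.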
Faculty: SCIENCE
Department: BIOLOGICAL AND ENVIRONMENTAL SCIENCES
Improved multifactor authentication with machine learning: A novel approach to improve safety
Background: With the increase in security breaches and cyberattacks, multifactor authentication (MFA) has become a popular approach for securing online accounts and systems. MFA uses multiple factors such as passwords, biometrics, and security tokens to authenticate a user’s identity. Machine learning (ML) is being explored as a potential tool for improving the security and usability of MFA systems.
Purpose: The purpose of this review is to provide an overview of the current state of research in the field of MFA using ML, identify strengths and limitations of existing approaches, and indicate potential areas for future research.
Method: A systematic literature search was done using several online databases. Search criteria were based on keywords related to MFA and ML. Study selection criteria were based on relevance to the research question and study quality.
Result: This review identified multiple approaches for using ML in MFA,
including using ML for biometrics, password prediction, and anomaly detection. The strength of these approaches includes improved accuracy, efficiency, and usability. However, there are also limitations such as the need for large datasets, the risk of bias, and the potential for attacks on ML models. Overall, this review highlights the potential of ML to improve MFA systems but also highlights the need for further research to address limitations and challenges.
Conclusion: MFA using ML is an emerging area with the potential to improve the security and usability of online systems. However, there are still limitations and challenges that need to be addressed before ML is widely adopted in MFA systems. Future research should focus on developing more robust and secure ML models, addressing bias and fairness issues, and exploring new approaches to MFA using ML.
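As one concrete (and deliberately minimal) illustration of the anomaly-detection approaches surveyed above, a z-score check can flag authentication attempts that deviate from a user's history; the feature (login hour) and the 3-sigma threshold here are illustrative assumptions, not a method from any reviewed study:

```python
from statistics import mean, stdev

def anomaly_scores(history, new_values):
    """Flag values that deviate from the user's history by more than
    three standard deviations (a simple z-score anomaly detector)."""
    mu, sigma = mean(history), stdev(history)
    return [abs(v - mu) / sigma > 3 for v in new_values]

# Hypothetical feature: hour-of-day of past logins for one user.
usual_hours = [9, 10, 9, 11, 10, 9, 10, 11, 10, 9]
print(anomaly_scores(usual_hours, [10, 3]))  # [False, True]
```

A production MFA system would combine many such features (device, location, typing cadence) and could step up to an additional factor only when an attempt is flagged.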
Metabolism in microglia: battling neuroinflammation with lactate and cannabidiol
Alzheimer’s and other neurodegenerative diseases share a common feature: chronic inflammation of microglia, the “housekeeping” cells of the brain. Microglia normally protect the brain from pathogens and maintain homeostasis. However, chronically inflamed microglia exacerbate cell death and worsen the disease. Attenuation of inflammation in microglia has been heavily researched as a potential therapeutic strategy, with molecules such as lactate and cannabidiol (CBD) found to be anti-inflammatory. Notably, CBD was found to upregulate the expression of Aquaporin-9 (AQP9), a membrane protein that moves small molecules like lactate into the cell. Thus, I hypothesized that CBD and lactate can synergistically
reduce inflammation, as upregulated AQP9 would pump more lactate into the cell, thereby increasing the anti-inflammatory effect. I assessed inflammation in microglial cell culture by measuring the release of nitric oxide (NO), an inflammatory signaling molecule. To explore the interaction of lactate and CBD, I quantified the expression of AQP9 protein and the lactate concentration. I performed a three-way ANOVA and identified significant effects of individual treatments with lactate and CBD. However, no evidence of a synergistic effect between the two was found. Nonetheless, other studies suggest an interaction between cannabinoid and lactate metabolism in microglia. It remains to be explored what that interaction is.
Faculty: SCIENCE
Department: BIOLOGICAL AND ENVIRONMENTAL SCIENCES
YOLOv5-based Material Recognition for Waste Classification
Object recognition is a critical problem in the field of computer vision, and it plays a crucial role in numerous applications, such as autonomous driving, security systems, and robotics. Traditional object recognition algorithms rely on a two-step pipeline, which involves detecting the Region of Interest (RoI) through segmentation or localization, followed by object identification or classification. However, this approach can be time-consuming and computationally expensive, especially when dealing with large datasets or real-time applications. To overcome these challenges, modern approaches perform localization and classification simultaneously, making the object recognition process significantly faster and more efficient. One of the most accurate models in terms of localization and classification is YOLOv5. It was released in 2020 and built on the YOLOv4 architecture using the PyTorch library. Previous experiments have shown that the model achieved a speed of 200 FPS on an NVIDIA V100 machine and a mean Average Precision (mAP) of 55.8% on the COCO dataset.
This performance is comparable with YOLOv8 in terms of mAP and outperforms it in terms of speed. Despite the efficiency and reliability of YOLOv5, the original model is limited to 80 classes of objects only. However, the architecture can be customized and modified to recognize other objects or classify based on other aspects of the objects. In this paper, we
present a customized object recognition model based on the modified YOLOv5 architecture, which classifies objects based on their material into five categories: glass, cardboard, metal, paper, and plastic. We use the garbage classification dataset from Kaggle to train the proposed network, consisting of approximately 2500 labeled images in the five classes mentioned. To improve the accuracy of the model, we further annotate the dataset using the Roboflow system to generate closed bounding boxes and achieve better localization. We also add an output layer to the modified YOLOv5 architecture for material recognition. The experimental results show that the proposed model accurately recognizes the material of objects, even in the presence of heavy occlusion and variations in illumination conditions. This model can be used in waste management centers to improve the efficiency of recycling strategies and eliminate manual interventions. Furthermore, it can be applied to other domains where the classification of material plays a crucial role, such as manufacturing or construction. In conclusion, the modified YOLOv5 architecture proposed in this paper is a promising approach to recognizing objects’ materials with high accuracy and efficiency, making it a suitable solution for various real-time applications. However, further research is needed to improve the model’s performance and extend its capabilities.
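Detectors in the YOLO family score box overlap with intersection-over-union (IoU) and prune duplicate detections with non-maximum suppression (NMS). The sketch below shows those two steps in a simplified form; it is not the YOLOv5 implementation, and the boxes and scores are made up:

```python
def iou(a, b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    x1, y1 = max(a[0], b[0]), max(a[1], b[1])
    x2, y2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, x2 - x1) * max(0, y2 - y1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter)

def nms(boxes, scores, threshold=0.5):
    """Keep the highest-scoring boxes, dropping any box whose IoU with
    an already-kept box exceeds the threshold."""
    order = sorted(range(len(boxes)), key=lambda i: -scores[i])
    keep = []
    for i in order:
        if all(iou(boxes[i], boxes[j]) <= threshold for j in keep):
            keep.append(i)
    return keep

boxes = [(0, 0, 10, 10), (1, 1, 11, 11), (20, 20, 30, 30)]
print(nms(boxes, [0.9, 0.8, 0.7]))  # [0, 2]
```

The mAP figures quoted for YOLOv5 are computed by sweeping score thresholds over exactly these IoU-matched detections.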
Faculty: SCIENCE
Department: MATHEMATICAL AND PHYSICAL SCIENCES
The ferroptotic effect of alumina nanoparticles on breast adenocarcinoma cells
Ferroptosis is a novel form of regulated cell death. It is dependent on iron in the cell, and it causes cell death through increased oxidative stress in the form of lipid peroxide buildup. This iron dependence makes ferroptosis a possible pathway to target for cancer therapy, as cancer cells use more iron due to their constantly heightened metabolism. Since they have more iron, ferroptosis is more likely to occur in these cells. This difference between cancer and healthy cells provides selectivity for possible cancer treatments exploiting ferroptosis. This study aims to examine the ferroptotic effects of alumina nanoparticles on breast adenocarcinoma cells to inform possible future cancer treatments. It was hypothesized that, first, the nanoparticles would induce ferroptosis; second, that greater nanoparticle concentrations would produce greater ferroptotic activity; and third, that ferroptotic activity would decrease as nanoparticle size increased. The study tested this by administering several different sizes and concentrations of alumina nanoparticles to the cancer cells in vitro. Ferroptosis was measured by image analysis of photos of cells stained with BODIPY-C11, a fluorescent tag whose fluorescence shifts from red to green when exposed to lipid peroxides. Because of this shift, the intensity of green fluorescence relative to red in a given sample can be measured and quantified with software. The results support
the first part of the hypothesis, in that ferroptosis did occur in the cells in response to nanoparticle administration. However, the second and third parts of the hypothesis were contradicted by both the results and their statistical analysis. They showed that overall ferroptotic activity decreased with increased particle concentration, and that ferroptotic activity increased with increased particle size. Moreover, the ANOVA performed on the effects of nanoparticle concentration and size showed that neither factor had a statistically significant effect on ferroptosis in this study. The inverse relationship between concentration and ferroptotic activity may be due to higher concentrations of nanoparticles causing clumping, preventing the cell from endocytosing the particles efficiently. The relationship between nanoparticle size and ferroptotic activity may reflect the fact that substances of different sizes are taken up by cells through different pathways, some of which are more efficient than others. Future studies similar to this one should use cell counting as another method of measuring ferroptosis, and should use erastin as a positive control instead of paclitaxel, as it is better understood to cause ferroptosis. More research should be done on the effects of different nanoparticles on different cancers, and on the toxicity of nanoparticles.
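The green-to-red fluorescence quantification described above can be sketched as a simple intensity ratio over the two channel images; the arrays below are synthetic placeholders, not data from the study, and real pipelines would add background subtraction and cell segmentation:

```python
import numpy as np

def ferroptosis_index(green, red, eps=1e-9):
    """Ratio of total green to total red fluorescence intensity.

    A higher ratio indicates more lipid peroxidation, reflecting the
    red-to-green spectral shift of the oxidized BODIPY probe."""
    return float(green.sum() / (red.sum() + eps))

# Hypothetical 4x4 intensity images for the two channels.
green = np.full((4, 4), 30.0)
red = np.full((4, 4), 60.0)
print(round(ferroptosis_index(green, red), 3))  # 0.5
```

Comparing this index between treated and control wells is one way the per-sample quantification could be carried out in software.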
Faculty: SCIENCE
Department: BIOLOGICAL AND ENVIRONMENTAL SCIENCES
Principal well-rounded ideals and applications
Principal well-rounded ideals are mathematical structures with applications in coding theory, cryptography, and discrete optimization. Using results from previous research and mathematical software, we generated thousands of integer pairs which can be used to construct principal well-rounded
ideals in a given quadratic field, for application to fast-fading channels in coding theory. We analyzed the data for patterns and found that, if one such pair exists, then we can generate infinitely many other pairs, providing infinitely many principal well-rounded ideals for the code design.
Faculty: SCIENCE
Department: MATHEMATICAL AND PHYSICAL SCIENCES
A mobile-based solution to identify fruits and vegetables using a machine learning model during grocery shopping
Grocery shopping has become an essential part of our daily life. From stocking up on weekly supplies to picking up a few items on the go, grocery shopping is something we cannot avoid. As of 2021, the grocery shopping market size in Canada was approximately CAD $120 billion, according to a report by Statista. The grocery shopping experience has evolved significantly over the years. Traditionally, stores employed staff to scan customers’ items during the checkout process. A 2019 report by the National Retail Federation (NRF) stated that the average checkout time in a grocery store was around 3-4 minutes. Now most big retailers have introduced self-checkout counters where customers scan each item themselves. A problem occurs when items like fruits and vegetables do not come with barcodes: customers must recognize the items and key in their names manually on the screen to complete the checkout. Customers often have difficulty identifying items when there is a variety of similar produce, such as tomato, pepper, or capsicum, and they often mistype the item name. This increases errors and adds delay at the self-checkout counter. A solution to this problem could be an automated way of identifying items placed on the weighing scale, which could save a substantial amount of time at the self-checkout counter. This paper presents a mobile-based solution that utilizes machine learning algorithms to identify fruits
and vegetables in real-time for grocery shopping.
The proposed solution comprises a mobile application that uses the camera of a smartphone to capture images of fruits and vegetables. The images are then processed using a deep learning model trained on a large dataset of fruit and vegetable images to identify the type of produce. The solution is designed to be user-friendly and intuitive, enabling shoppers to easily and quickly identify the produce they want to purchase without the need for expert knowledge or assistance. The paper discusses the implementation of the mobile application, which includes features such as a user-friendly interface, real-time image processing, and seamless integration with grocery shopping applications.
The proposed solution has several advantages over traditional methods of fruit and vegetable identification, such as expert knowledge and manual inspection. It is convenient, fast, and accurate, and it enables shoppers to make informed decisions about the produce they purchase. It can also be easily extended to other types of products, such as packaged goods, providing a comprehensive solution for grocery shopping. Overall, this mobile-based solution represents a significant step forward in using machine learning to improve the shopping experience for consumers.
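The classification step can be caricatured with a nearest-centroid classifier over mean-colour features; this is a drastically simplified, hypothetical stand-in for the deep learning model described, included only to make the pipeline (image feature in, produce label out) concrete:

```python
import numpy as np

# Hypothetical mean-colour centroids (R, G, B) per produce class;
# a real system would learn high-dimensional features from images.
CENTROIDS = {
    "tomato": np.array([200.0, 40.0, 30.0]),
    "capsicum": np.array([60.0, 160.0, 50.0]),
    "banana": np.array([220.0, 200.0, 40.0]),
}

def classify(feature):
    """Return the class whose centroid is nearest to the feature vector."""
    return min(CENTROIDS, key=lambda k: np.linalg.norm(feature - CENTROIDS[k]))

print(classify(np.array([190.0, 50.0, 35.0])))  # tomato
```

In the proposed app, this predicted label would pre-fill the self-checkout screen, replacing the manual item lookup.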
Faculty: SCIENCE
Department: MATHEMATICAL AND PHYSICAL SCIENCES
Attention-based neural networks for protein structure prediction
Protein structure is one of a protein’s most important characteristics. It is through its structure that a protein is able to interact with other proteins and with different molecules. Therefore, the ability to accurately predict the 3D structure of a protein is a key advancement in the process of discovering new ways of fighting infectious diseases or even tackling environmental problems. Machine learning has proven a great ally on the path to an effective estimation of protein structure, down to atomic precision in the angles of fold. New methods have been developed that have revolutionized the field. The topic of this research is the use of a simplified attention-based neural network for predicting and visualizing protein angles from amino acid sequences with precision. Attention networks work by applying different weights to different parts of the input in order to determine which information the model should focus on. This approach can be applied to properly identify the
critical interactions between different parts of the protein sequence that contribute to its folding. These important parts of the sequence can include amino acids that form critical interactions, such as hydrogen bonds, without the model necessarily having a score function for hydrogen bonding, which makes it more efficient on a limited dataset while being able to cope with the variety and complexity of the structural data, as assessed by benchmarking on the CASP13 and CASP14 datasets. Overall, attention-based networks have proved to be a promising line of research on protein folding, with great potential to enhance our understanding of the structures and interactions of proteins. The intrinsic challenge of experimental structure determination has previously limited the expansion of our structural knowledge, but these new methods, when allied to a large and well-curated database of structures and sequences, can provide grounds for a fast-evolving body of knowledge in bioinformatics and other biophysical problems.
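A minimal sketch of the attention mechanism described above, in plain Python and far simpler than a full protein-structure network; the vectors below are illustrative only:

```python
import math

def softmax(scores):
    """Numerically stable softmax: turns raw scores into attention weights."""
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def attention(query, keys, values):
    """Scaled dot-product attention for a single query vector.
    Positions whose keys align with the query receive higher weight,
    which is how the network learns where to 'focus'."""
    d = len(query)
    scores = [sum(q * k for q, k in zip(query, key)) / math.sqrt(d) for key in keys]
    weights = softmax(scores)
    # The output is the weight-averaged value vector.
    dim = len(values[0])
    output = [sum(w * v[i] for w, v in zip(weights, values)) for i in range(dim)]
    return output, weights

out, weights = attention([1.0, 0.0], [[1.0, 0.0], [0.0, 1.0]], [[1.0], [0.0]])
# The first position, whose key matches the query, receives the larger weight.
```

In a sequence model, each residue position would issue its own query over the whole sequence, so the weights expose which residue pairs the model treats as interacting.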
Faculty: SCIENCE
Department: PHYSICAL SCIENCES
Ethics in AI: Detecting and Reducing Bias in AI Algorithms
As artificial intelligence (AI) systems become more prevalent in our daily lives, it is crucial to address biases in these systems. AI algorithms are used in a variety of fields, such as finance, healthcare, and criminal justice, where their choices can have a big impact on people and communities. However, biases in training data or model design can result in unfair treatment and exacerbate already-existing inequalities, disproportionately affecting marginalised groups. Therefore, it is critical to identify and address biases in AI algorithms in order to ensure that AI systems operate justly and ethically, advancing social justice and increasing confidence in AI technologies.
The goal of this project is to create a thorough framework for identifying and reducing biases in AI algorithms. To do this, we will start by conducting a thorough literature review to pinpoint current approaches, problems, and industry best practices. Then, we’ll look into various real-world datasets to find any potential biases and evaluate algorithmic fairness using statistical techniques, visualisations, and other quantitative methods. We’ll suggest cutting-edge methods and tactics to reduce the biases found in AI models, making them more impartial, fair, and ethical. Our methods’ efficacy in fostering fairness and minimising biases in AI systems will be tested using a range of AI models and datasets. The project’s originality lies in the creation of cutting-edge bias mitigation methods, which will advance ethical AI practices and research.
This project will benefit society by fostering fairness, equity, and trust in AI applications by addressing the crucial problem of biases in AI algorithms. Thanks to the development of a comprehensive framework for bias detection and mitigation, AI practitioners and researchers will be able to design and implement AI systems that minimise discrimination and guarantee fair treatment for all users. Additionally, the dissemination of our research and created tools through scholarly works, open-source software, and instructional materials will promote a deeper comprehension of ethical AI principles and practices. In the end, this project will help build a more equitable and diverse society where AI technologies can be used to empower people and communities instead of preserving current inequalities.
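One of the simplest quantitative fairness checks alluded to above compares positive-prediction rates across groups (demographic parity). A minimal sketch, assuming binary predictions and exactly two groups; the data are hypothetical:

```python
def demographic_parity_gap(predictions, groups):
    """Absolute gap in positive-prediction rate between two groups.
    A gap near 0 suggests the model treats the groups similarly on
    this single, limited criterion (it says nothing about accuracy)."""
    rates = []
    for g in sorted(set(groups)):
        members = [p for p, grp in zip(predictions, groups) if grp == g]
        rates.append(sum(members) / len(members))
    return abs(rates[0] - rates[1])

# Hypothetical predictions (1 = approved) for applicants in groups "a" and "b".
preds  = [1, 1, 0, 1, 0, 0, 0, 1]
groups = ["a", "a", "a", "a", "b", "b", "b", "b"]
gap = demographic_parity_gap(preds, groups)  # 0.75 vs 0.25 -> gap 0.5
```

Real audits combine several such metrics (equalized odds, predictive parity, etc.), since no single criterion captures fairness on its own.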
Faculty: SCIENCE
Department: MATHEMATICAL AND PHYSICAL SCIENCES
Multimodal Speech Emotion Recognition (MSER) is a crucial area of research that has gained significance in recent years. Accurately detecting emotions from speech is essential in many applications like human-robot interaction, virtual assistants, and mental health monitoring. The use of multiple modalities, including audio, visual, and textual features, has shown promising results in improving the accuracy of speech emotion recognition systems. The importance of MSER lies in its potential to enable machines to understand and respond appropriately to human emotions, which is critical in various fields, including healthcare, education, and entertainment.
Accurately recognizing human emotions can help in understanding the emotional state of individuals, which can aid in providing personalized treatment plans for individuals with mental health issues. In education, multimodal speech emotion recognition can help in evaluating students’ emotional state, thus enhancing their learning experience. MSER is a challenging task as emotions are subjective and vary depending on cultural and individual differences. Therefore, developing robust and accurate models that can accurately recognize emotions in real-time is crucial for its successful implementation in various domains. The novelty of this field lies in the fact that it can capture more comprehensive
information about the emotional state of a person, leading to more accurate recognition of emotions. The potential outcomes of this research are wide-ranging, from helping individuals with mental health disorders to enabling the creation of more emotionally intelligent human-machine interfaces. The ability to accurately detect emotions in real-time can have a significant impact on society and improve human-to-human and human-to-machine interactions. MSER can have significant benefits for society and communities. One of the key benefits is improved mental health care, especially for people who may struggle to communicate their emotions effectively, such as individuals on the autism spectrum or those with mental health disorders. MSER can also be used in various industries, such as customer service and entertainment, to analyze and respond to customer feedback or emotions. In the education sector, MSER can help in detecting and addressing students’ emotional issues, improving their learning outcomes. In addition, MSER can aid in the early detection of mental health issues, which can lead to better treatment outcomes and improved quality of life for individuals. Overall, MSER has the potential to significantly improve emotional well-being and mental health care, making it an important area of research for the benefit of society as a whole.
Faculty: SCIENCE
Department: MATHEMATICAL AND PHYSICAL SCIENCES
Emotion recognition from speech signals: a multi-modal approach with deep neural networks
Characterization and identification of polystyrene-degrading bacteria extracted from the gut of the mealworm, Tenebrio molitor
Since the 1950s, over 9 billion tons of plastic have been produced globally (Royer et al. 2018). Large production volumes have led to the accumulation of plastic wastes in our environment, with over 129 thousand tonnes of waste found in the Great Pacific Garbage Patch (Lebreton et al. 2018). One popular form of plastic is polystyrene. Polystyrene comes in both a hard form and a foam form more commonly known as Styrofoam. Both forms, however, are considered non-biodegradable, as these plastics take over 500 years to break down (Nakatani et al. 2022). Tenebrio molitor larvae, more commonly known as mealworms, have been identified to have gut bacteria that are capable of metabolizing polystyrene into carbon by-products. Although some polystyrene-degrading bacteria have already been identified through research, different mealworms have different cultures of bacteria living within them, and therefore it is possible that further research will unveil new, potentially more effective, polystyrene-degrading bacteria. This experiment will isolate previously extracted bacteria into monocultures, which will then be grown in a carbon-free medium with polystyrene. Using a spectrophotometer, their growth will be measured as an indication of polystyrene degradation. Their DNA will then be extracted using a QIAGEN blood and tissue kit, and the concentration of this DNA will be measured using a NanoDrop. This DNA will be amplified by random amplified polymorphic DNA
polymerase chain reaction. The reaction products will then be run through an agarose gel in order to test for genetic variation. A standard polymerase chain reaction will also be performed to amplify the V4 region of the 16S rRNA gene. These reaction products will also be run through an agarose gel, with the DNA band in the gel being extracted and purified using a QIAquick gel extraction kit. This purified product will then be sent to Genome Quebec for sequencing analysis. Once the sequence is returned, the unknown bacteria can be identified through the National Center for Biotechnology Information database. Out of the 31 bacteria that were originally isolated into monocultures, 29 showed successful growth on nutrient agar plates. DNA was extracted from each of these 29 bacteria, and the samples with the 15 highest concentrations were selected to proceed with the experiment. Out of these 15 samples, 11 showed an increase in growth in the carbon-free medium, suggesting positive polystyrene degradation. One sample suffered DNA degradation during the standard polymerase chain reaction. In total, 14 samples were sent to Genome Quebec for sequencing, with expected results and final identification to occur the week of April 9, 2023. This work has the potential to identify unique bacteria associated with polystyrene degradation and add to the current literature of identified polystyrene-degrading bacteria.
Faculty: SCIENCE
Department: BIOLOGICAL AND ENVIRONMENTAL SCIENCES
Radiogenomic Analysis for Personalized Breast Cancer Prognosis and Treatment Optimization
Worldwide, breast cancer affects more women than any other type of cancer, and the number of new cases identified each year is rising. Early discovery, precise prognosis, and individualised treatment approaches are essential for enhancing survival rates and reducing treatment-related side effects. By locating novel biomarkers and describing tumour heterogeneity, radiogenomics, an interdisciplinary field integrating radiomics and genomics, offers a viable route for improving breast cancer prognosis and treatment. By fusing multiparametric MRI data with genomic and transcriptomic characteristics, this study will create a strong radiogenomic framework for breast cancer. We will use cutting-edge machine learning techniques, such as convolutional neural networks (CNNs) and graph convolutional networks (GCNs), to extract pertinent imaging features and look into the relationships between imaging phenotypes and molecular changes. In addition, survival analysis will be
used to find relevant prognostic and predictive biomarkers. Our method is unique in the way it combines many data sources, giving a thorough understanding of the intricate relationships between imaging and genomic traits and their effects on breast cancer development and treatment response. Patients, clinicians, and researchers will all benefit significantly from the proposed radiogenomic model for breast cancer. Patients will receive more accurate prognoses and individualised treatment plans, which will increase survival rates and decrease treatment-related adverse effects. Clinicians will have a better grasp of tumour biology and heterogeneity, enabling them to make well-informed treatment choices. The discovery of novel biomarkers will help researchers by creating new opportunities for developing targeted treatments. Ultimately, this study will contribute substantially to the development of precision oncology.
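The survival analysis step mentioned above typically starts from the Kaplan-Meier product-limit estimator; a minimal sketch with hypothetical follow-up data (not the study’s own):

```python
def kaplan_meier(times, events):
    """Kaplan-Meier product-limit estimator. `times` are follow-up times;
    `events` mark whether the event of interest (1) occurred at that time
    or the observation was censored (0). Returns (time, S(t)) pairs at
    each event time."""
    survival = 1.0
    curve = []
    for t in sorted(set(times)):
        at_risk = sum(1 for tt in times if tt >= t)
        deaths = sum(1 for tt, e in zip(times, events) if tt == t and e == 1)
        if deaths:
            # At each event time, multiply by the fraction surviving it.
            survival *= 1 - deaths / at_risk
            curve.append((t, survival))
    return curve

# Hypothetical follow-up times (months) and event indicators.
times  = [1, 2, 3, 4, 5]
events = [1, 1, 0, 1, 0]
curve = kaplan_meier(times, events)
```

Comparing such curves between biomarker-defined patient groups (e.g. with a log-rank test) is the usual route to the prognostic biomarkers the abstract describes.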
Faculty: SCIENCE
Department: MATHEMATICAL AND PHYSICAL SCIENCES
Developing a real-time musical note detection and transcription system using machine learning algorithms
Note recognition and transcription are fundamental tasks in music analysis and production. Accurately identifying and transcribing notes can provide valuable information about the composition, structure and style of music. Traditional methods for transcribing music by hand are time-consuming, labour-intensive and error-prone. Therefore, there is a growing need for an automatic real-time note recognition and transcription system using machine learning algorithms. Music companies and academic institutions will greatly benefit from the development of such a system. Musicologists can use this system to study the evolution of music over time, identify patterns and trends in musical compositions, and analyse and compare different music styles and cultures. Despite the potential benefits, it is challenging to develop a real-time system that detects musical notes and transcribes them accurately and reliably. Using machine learning algorithms, this project aims to develop a real-time system for detecting and transcribing musical notes. The system will involve several stages, including audio signal processing, feature extraction, and note detection and transcription. Feature extraction will be performed after the audio signal has been preprocessed and converted into a suitable format. Pitch, duration, and amplitude will be extracted from the audio signal during the feature extraction stage. Machine learning algorithms will classify and
transcribe musical notes based on the extracted features in the note detection and transcription stage. The system will be trained using a large dataset of annotated musical notes to learn the patterns and characteristics of different musical styles and instruments. The system will also incorporate advanced signal processing techniques, such as time-frequency analysis and spectral analysis, to enhance the accuracy and robustness of note detection and transcription. The system will provide valuable information about the structure and composition of music, enabling new forms of music analysis and production. In addition, the system has practical applications in the music industry and academia, providing a more effective and efficient way to analyse and produce music. Developing a real-time system for detecting and transcribing musical notes has several benefits for society and the community. It provides new insights into different musical styles and cultures, which can enhance the quality and diversity of music. In addition to improving efficiency and cost-effectiveness, it allows for more accurate and efficient music analysis, and supports innovation and creativity in the music industry. The system can also be used in music education, allowing students to learn and analyse music more effectively. Overall, this project has the potential to contribute significantly to the music industry, academia and society.
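As a small, concrete piece of the pitch-handling step described above, a detected fundamental frequency can be snapped to the nearest equal-tempered note via the standard MIDI numbering (this is a generic convention, not the project’s own code):

```python
import math

NOTE_NAMES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

def freq_to_note(freq_hz):
    """Map a detected fundamental frequency to the nearest note in
    12-tone equal temperament, using the MIDI convention A4 = 440 Hz:
    midi = 69 + 12 * log2(f / 440)."""
    midi = round(69 + 12 * math.log2(freq_hz / 440.0))
    return f"{NOTE_NAMES[midi % 12]}{midi // 12 - 1}"

note = freq_to_note(261.63)  # middle C
```

The hard part of the project, estimating the fundamental frequency from raw audio in the first place, is what the learned models and time-frequency analysis are for.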
Faculty: SCIENCE
Department: MATHEMATICAL AND PHYSICAL SCIENCES
Machine Learning Based Software Quality Assessment for Object Oriented Technology
This research discusses the importance of measuring, preserving and growing software quality, given the increasing complexity of, and reliance on, software in everyday goods and services. It proposes that software metrics can offer a quantitative way of monitoring software characteristics, and that software quality prediction can be approached as a classification or concept-learning problem within the framework of machine learning. The research provides a context for applying machine learning methods in large software organizations to evaluate and forecast software quality. Using the ISO 15939 measurement information model, the study demonstrates how different software metrics can be used to create a software quality model that meets the needs of these organizations, and it documents the effective use of machine learning approaches for software quality evaluation. Software metrics provide a quantitative way of monitoring the different characteristics of software systems, and software quality prediction can be treated as a classification problem amenable to machine learning methods. The paper presents a general context for the application of machine learning in large software organizations
for evaluating and forecasting software quality using the ISO 15939 measurement information model. The models are based on observable attributes and may use statistical or logical methods, such as decision trees or rule sets, for estimation. Software metrics have long been used to track and monitor software processes and to assess and increase software quality, and they are an essential part of everyday work in large software development organizations. This paper proposes the use of machine learning methods for the evaluation and prediction of software quality within the ISO/IEC 15939 measurement model. The paper emphasizes the importance of software maintainability as a critical characteristic of software quality and highlights the role of software metrics in predicting maintenance effort using different tools and processes. It also discusses how businesses can benefit from developing easily maintainable software to reduce the cost and effort of software maintenance, which accounts for a significant portion of the overall software development expense. The paper concludes that maintainability should be considered a primary quality characteristic of a system, and that a substantial body of research has been carried out to define the attributes or factors that bear on the effort required to make specified modifications, so that maintainability can be estimated.
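To make the idea of learning a quality model from a software metric concrete, the simplest possible learner, a one-metric decision stump, can be sketched as below; the metric values, labels, and threshold search are illustrative only:

```python
def fit_stump(metric_values, labels):
    """Learn the threshold on one software metric (e.g. cyclomatic
    complexity) that best separates fault-prone modules (label 1)
    from the rest (label 0) on the training data."""
    best_threshold, best_accuracy = None, -1.0
    for t in sorted(set(metric_values)):
        predictions = [1 if v >= t else 0 for v in metric_values]
        accuracy = sum(p == y for p, y in zip(predictions, labels)) / len(labels)
        if accuracy > best_accuracy:
            best_threshold, best_accuracy = t, accuracy
    return best_threshold, best_accuracy

# Hypothetical module complexities and fault labels.
complexities = [2, 3, 4, 11, 15, 20]
faulty       = [0, 0, 0, 1,  1,  1]
threshold, accuracy = fit_stump(complexities, faulty)  # (11, 1.0)
```

A full decision tree or rule set, as the paper mentions, repeats this split search recursively over many metrics at once.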
Faculty: SCIENCE
Department: MATHEMATICAL AND PHYSICAL SCIENCES
Emotions are vital in human communication and can be conveyed through various modalities, including speech and nonverbal cues. With the growing interest in developing automated systems that can recognize and categorize emotions based on these modalities, this project proposes a deep learning model based on convolutional neural networks (CNNs) for analyzing and categorizing the sounds of emotions. The proposed model will be trained on a large dataset of audio samples, each labelled with one of five emotions: surprise, anger, calm, disgust, or sadness. To preprocess the data, the audio samples will be converted into spectrograms, which will serve as input to the CNN model. This approach is effective for recognizing emotions because the spectrogram reveals the frequency content and distribution of speech sounds. The CNN model is well suited for this task because of its ability to identify patterns and features in the input data. The large dataset of audio samples of human speech comes from Kaggle and contains many emotions; the audio clips have an average length of 3 seconds. Moreover, the audio samples will be transcribed to obtain text, which will be used as a second source during this step. This will allow us to discover emotional patterns in speech and achieve multimodal speech recognition later. The
CNN model will be trained in TensorFlow from scratch (that is, without using pre-trained models). Next, the model will be fine-tuned on the emotion recognition task and its hyperparameters optimized to achieve the best possible performance. To evaluate the performance of the CNN model, I will conduct experiments on a test set of audio samples and measure the model’s accuracy, precision, recall, and F1 score. These metrics are essential for evaluating the model’s effectiveness in recognizing and categorizing emotions accurately. The expected output will be an emotion class. In conclusion, the proposed CNN-based approach for analyzing and categorizing the sounds of emotions is a significant step towards automated sound-based emotion recognition systems. The approach has the potential to be a valuable tool in various applications, including emotion recognition in human-computer interaction, social robotics, and mental health diagnosis and treatment. As the technology continues to evolve, emotion recognition’s potential applications and benefits will only continue to expand. However, several challenges need to be addressed; for example, the model’s performance may vary when tested on a dataset different from the one on which it was trained.
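The spectrogram preprocessing step can be sketched with a naive discrete Fourier transform; real pipelines would use an FFT (e.g. `numpy.fft.rfft`) over longer windows, so the frame length and hop below are illustrative:

```python
import math

def spectrogram(signal, frame_len=64, hop=32):
    """Magnitude spectrogram: slide a window over the signal and take
    the DFT magnitude of each frame. Rows are time frames; columns
    are frequency bins 0..frame_len/2."""
    frames = []
    for start in range(0, len(signal) - frame_len + 1, hop):
        frame = signal[start:start + frame_len]
        mags = []
        for k in range(frame_len // 2 + 1):
            re = sum(x * math.cos(2 * math.pi * k * n / frame_len)
                     for n, x in enumerate(frame))
            im = -sum(x * math.sin(2 * math.pi * k * n / frame_len)
                      for n, x in enumerate(frame))
            mags.append(math.hypot(re, im))
        frames.append(mags)
    return frames

# A pure tone at DFT bin 4 shows up as a peak in that bin.
tone = [math.sin(2 * math.pi * 4 * n / 64) for n in range(64)]
spec = spectrogram(tone)
peak_bin = spec[0].index(max(spec[0]))  # 4
```

The resulting time-frequency grid is exactly the image-like input a CNN expects.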
Faculty: SCIENCE
Department: MATHEMATICAL AND PHYSICAL SCIENCES
Nitrogen-fixing genetics of bacteria isolated from superworm larvae, Zophobas morio
Nitrogen is one of the most important elements for plant growth. It contributes to strong growth of roots and leaves and can be used as a natural fertilizer that does not cause pollution. There are various sources of nitrogen, and one of them is frass from the larvae of superworms, Zophobas morio, which contains nitrogen-fixing bacteria. These bacteria live in superworm guts and are able to fix nitrogen while consuming polystyrene; the nitrogen is found in the excrement of
the superworms. Polystyrene itself is often used in everyday human life. However, this compound causes pollution, and its burning emits high amounts of toxic gasses. In this study we will determine which bacteria derived from the larvae of superworms are responsible for nitrogen fixation, using DNA extraction and PCR. This study will contribute to decreasing pollution by polystyrene and to producing nitrogen that can be used as an organic fertilizer.
Faculty: SCIENCE
Department: BIOLOGICAL AND ENVIRONMENTAL SCIENCES
For this project I will be researching machine learning algorithms applied to drug discovery. This research is important because viruses and diseases continually evolve and become more resistant to current treatments, so research on new drugs is essential to keep our society as healthy as possible. Machine learning plays a central role here because algorithms can be used to analyze datasets; for example, it is common practice to use the Random Forest algorithm to improve affinity prediction between a ligand and a protein in virtual screening, by selecting molecular descriptors based on a training dataset of enzymes. In my case, I will be using a regression model in Python to predict the solubility of molecules. My methodology is straightforward: I plan to do extensive research on the most common algorithms used for drug discovery (Random Forest, Naïve Bayes, Support Vector Machine, and regression models) and use them for comparative purposes. After that, my objective is to reach conclusions or modify these types of algorithms to improve their performance. As the practical part of my project, I will test out in Python code
a regression model for predicting the solubility of molecules. The code’s main source is the Data Professor on YouTube, Chanin Nantasenamat, Senior Developer Advocate and former professor of bioinformatics. I took his “Bioinformatics Tutorial Series” as a reference and applied his code to my investigation. Solubility is an important physicochemical property of drugs in drug discovery, design, and development. This project and research will benefit the community mainly in medical and pharmaceutical respects. Although research focused on drug discovery may not change society directly, health concerns all of us, which is a major reason this project matters. Nowadays we rely on drugs more often in our daily lives, so as their use increases, we need to invest in discovering which drugs are the best ones and to keep researching new ones that can perform better than those we already have.
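A minimal sketch of the regression step, assuming a single hypothetical descriptor (logP) as the predictor of log-solubility; the referenced tutorial uses several molecular descriptors and scikit-learn rather than this hand-rolled least squares:

```python
def fit_linear(xs, ys):
    """Ordinary least squares for y = slope * x + intercept, with one
    molecular descriptor x as the predictor of log-solubility y."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    slope = cov / var
    return slope, mean_y - slope * mean_x

# Hypothetical (logP, logS) pairs: more lipophilic molecules dissolve less.
logp = [0.5, 1.0, 2.0, 3.0, 4.0]
logs = [-0.4, -0.9, -1.8, -3.1, -4.0]
slope, intercept = fit_linear(logp, logs)
predicted_logs = slope * 2.5 + intercept  # negative slope: solubility falls as logP rises
```

Swapping this for `sklearn.linear_model.LinearRegression` over real descriptors (as in the tutorial) keeps the same structure: fit on training pairs, then predict for new molecules.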
Faculty: SCIENCE
Department: MATHEMATICAL AND PHYSICAL SCIENCES
Machine learning applications for drug discovery and its implementation on predicting the solubility of molecules
Precious Onyejose, Ha Tran
The Goldbach conjecture, first formulated by Christian Goldbach in 1742, is one of the most studied problems in additive number theory. It states that every even number greater than 4 can be expressed as a sum of two odd primes [1]. While believed to be true, a formal proof has yet to be provided. The conjecture has been verified for all even numbers up to 4.0×10^{18} by Tomas Oliveira e Silva, Siegfried Herzog and Silvio Pardi. In this
study, we aimed to use an extensive computation method to verify the conjecture by developing Python code using given algorithms, similar to the approach in [2]. Our experimental result, together with the result in [2], provides strong empirical evidence for the validity of the conjecture and a practical approach to one of the most challenging problems in number theory.
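A verification loop of the kind described, far smaller in range than the 4.0×10^{18} bound and using plain trial-division primality rather than an optimized sieve, can be sketched as:

```python
def is_prime(n):
    """Trial-division primality test (adequate for small ranges)."""
    if n < 2:
        return False
    if n % 2 == 0:
        return n == 2
    i = 3
    while i * i <= n:
        if n % i == 0:
            return False
        i += 2
    return True

def goldbach_pair(n):
    """Return the first pair of odd primes (p, q), p <= q, with p + q == n,
    or None if no pair exists (which would be a counterexample)."""
    for p in range(3, n // 2 + 1, 2):
        if is_prime(p) and is_prime(n - p):
            return p, n - p
    return None

# Verify the conjecture for every even number in a small range.
checked = all(goldbach_pair(n) is not None for n in range(6, 10000, 2))
```

Large-scale verifications replace the trial division with a sieve of Eratosthenes and exploit the fact that the smallest prime in a pair is typically tiny.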
Faculty: SCIENCE
Department: MATHEMATICAL AND PHYSICAL SCIENCES
Why the Predator-Prey Body Size Ratio is Higher in Pelagic Food Webs: from the Perspective of 3-Dimensional Foraging
Marine food webs often display a higher predator-prey body size ratio than their terrestrial counterparts, and numerous hypotheses have been proposed to explain this difference. In this study, I suggest an alternative explanation based on the spatial dimensions of foraging arenas. Terrestrial animals typically forage in two-dimensional spaces, whereas marine animals mostly forage in three-dimensional environments. To explore this idea, I conducted simulations and found that marine predators with three-dimensional foraging environments are more efficient hunters than their two-dimensional terrestrial counterparts. This is due, in part,
to the fact that marine prey often form larger swarms or aggregations, which increases the efficiency of predator foraging. To delve deeper into this phenomenon, I developed an adaptive dynamical model based on the Lotka-Volterra equations, with the predator-prey ratio as an adaptive variable. My model predicts that high predator foraging efficiency will lead to the evolution of greater predator-prey ratios. Therefore, marine food webs, which typically have three-dimensional foraging environments and higher predator foraging efficiency, will tend to have a higher predator-prey ratio than terrestrial food webs.
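The baseline (non-adaptive) dynamics underlying the model can be sketched with a forward-Euler integration of the classic Lotka-Volterra equations; the parameters below are illustrative, and the adaptive predator-prey-ratio variable of the study is not shown:

```python
def lotka_volterra(prey0, pred0, a=1.0, b=0.1, c=0.5, d=0.4, dt=0.001, steps=20000):
    """Forward-Euler integration of
        dx/dt = a*x - b*x*y        (prey growth minus predation)
        dy/dt = c*b*x*y - d*y      (predator gain minus mortality),
    where c is the efficiency of converting consumed prey into predators.
    Higher foraging efficiency enters through the attack rate b."""
    x, y = prey0, pred0
    for _ in range(steps):
        dx = (a * x - b * x * y) * dt
        dy = (c * b * x * y - d * y) * dt
        x, y = x + dx, y + dy
    return x, y

prey, pred = lotka_volterra(10.0, 5.0)
```

Making a parameter like the attack rate an evolving trait, as the study does with the predator-prey ratio, turns this fixed-parameter system into an adaptive-dynamics model.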
Faculty: SCIENCE
Department: BIOLOGICAL AND ENVIRONMENTAL SCIENCES
Neha Tholar, Sergey Butakov
Android Malware Analysis Lab
The main objective of the Android Malware Analysis Lab development project is to provide a free, open-source platform for analyzing Android-based malware. The analysis process is supported through dynamic and static tools, along with an emulator that simulates the Android device. MobSF is the tool used; it works with the Android SDK to conduct dynamic tests of malware files in a sandboxed environment. The key features of the lab image file are portability, customizability, flexibility, and accessibility. Portability means the image can be moved to a different cloud or hosted on a local virtual machine. Flexibility covers dynamic memory provisioning and expanding
the set of malware samples available. Customizability allows additional tools to be installed. Accessibility permits a user to access the image through different methods. The malware samples included in this lab are made available through AndroZoo. The target audience for the project is students and entry-level malware analysts. The lab provides an easy way to study malware analysis in a safe and structured setup, accompanied by easy-to-use manuals.
Keywords— Android malware, malware analysis, cloud appliance, malware lab, MobSF, Android tools, Android SDK, apk file analysis, Cloud image, malware samples, Android emulator, analysis lab
Faculty: MANAGEMENT
Department: MASTER OF INFORMATION SYSTEMS SECURITY MANAGEMENT
Thursday, April 20, 2023
Mayuri Ravichandran, Gayatri Uttwani, Eslam AbdAllah
Enhancing Named Data Networking (NDN) Security from DDoS Attacks Using Software Defined Networking (SDN) Controllers
Named Data Networking (NDN) and Software Defined
Networking (SDN) are two evolving network technologies that promise to overcome the increasing demands for content delivery, scalability, mobility and networking, and to address current security issues on the Internet. NDN is a name-based architecture: instead of IP addresses, the name of the object, i.e., the Named Data Object (NDO), is used. Data retrieval works on interest and data packets. The consumer sends an interest packet based on named data; the interest is routed through the network, the data is fetched from the nearest caching router, and it is then returned to the client. SDN technology focuses on centralized network management by decoupling the control plane from the data plane. By implementing the SDN-NDN-based architecture, the security of the NDN is enhanced and strengthened using the programmability feature of SDN, as the packets are forwarded through the SDN controllers under centralized management. Distributed
Denial of Service (DDoS) attacks remain a serious threat to the confidentiality, integrity and availability of networks, and Interest flooding has been identified as a type of DDoS attack in NDN. We summarized the survey of security attacks on NDN and SDN and analyzed an Interest Flooding DDoS attack scenario on an eleven-node NDN architecture, along with its experimental metrics such as throughput, number of satisfied interests and number of impacted nodes. Furthermore, through our research, we implemented a hundred-node NDN architecture model and replicated the DDoS attack scenario along with the output metrics. We were able to integrate SDN technology with our hundred-node NDN testbed, and through the experimental output metrics we achieved our objective of enhancing NDN security against DDoS, demonstrating a 35% improvement in the overall performance metrics of NDN.
Faculty: MANAGEMENT
Department: MASTER OF INFORMATION SYSTEMS SECURITY MANAGEMENT
Enhancing Security in RFID Using Klein Algorithm
The significance and application of Radio Frequency Identification (RFID) have been continuously increasing across many industries. At the same time, protecting the privacy and security of data related to RFID has been a major concern and challenge over the last few years. Attacking RFID systems that have no appropriate security measures has become relatively easy with the help of technological advancements. Hence, in order to address these security concerns in RFID and to maintain the proper flow of communication between the RFID reader and tag, the best security methodologies were compared, and lightweight
cryptography appears to be the best available method to ensure security in RFID systems. This is because RFID devices are usually low-cost and small, so lightweight cryptographic algorithms suitable for resource-constrained devices can be implemented. Hence, in this research paper, the lightweight cryptographic algorithms were first analyzed, and the Klein algorithm was then chosen to implement with the RFID system for its low power consumption. The paper also includes a software simulation of an RFID system integrated with Klein in order to enhance the security of the system.
Faculty: MANAGEMENT
Department: MASTER OF INFORMATION SYSTEMS SECURITY MANAGEMENT
Blockchain-based Digital Assets Financial Reporting and Valuation
Blockchain-based digital assets, particularly cryptocurrencies and non-fungible tokens (NFTs), have recently gained significant popularity due to their unique attributes, benefits, and challenges. This book chapter aims to provide a comprehensive understanding of digital assets’ financial reporting and valuation. The chapter is divided into three sections, with the first section presenting an overview of digital assets, highlighting the attributes, benefits, and challenges of cryptocurrencies and NFTs. The second section delves into the valuation and financial reporting of digital assets, including the regulatory frameworks for cryptocurrencies and NFTs, tax implications for these assets, and the types of valuation methodology used. The final section explores the future prospects of cryptocurrencies and NFTs, considering their global adoption and impact on the economy. Cryptocurrencies represent fungible digital assets that can be exchanged for other assets or currencies, while NFTs are unique digital assets that represent ownership of a specific asset, such as artwork or music. In terms of valuation, the chapter discusses various methodologies used to value digital assets, including the cost, market, and income approaches, and how these methods may be adapted to account for the unique characteristics of digital assets. Furthermore, the chapter provides insights into digital assets’ financial and corporate reporting requirements and how they may differ
from traditional financial reporting standards. The regulatory framework for digital assets is also discussed, with a focus on the current regulatory environment and the need for clarity and consistency in regulations to promote investor confidence and protect consumers. The chapter also explores the impact of taxation on cryptocurrencies and NFTs, highlighting the complexities of tax regulations and compliance considerations for digital assets. It examines the current tax regulations and their potential impact on the valuation and reporting of digital assets. Finally, the chapter discusses the future prospects of cryptocurrencies and NFTs, considering their potential impact on the global economy, including their role in disrupting traditional financial systems and promoting financial inclusion. The chapter provides recommendations and best practices for stakeholders to consider when investing, valuing, or reporting digital assets. In conclusion, this book chapter provides a comprehensive overview of the financial reporting and valuation of blockchain-based digital assets, including cryptocurrencies and NFTs. It highlights the need for greater clarity and consistency in regulatory frameworks and provides insights into the potential impact of taxation on these assets. The chapter also offers recommendations and best practices for stakeholders to consider when dealing with digital assets.
Faculty: Management
Department: Management
Program: Master of Information Systems Assurance Management
The Most Important Human Resources Topic Today: Equity, Diversity, and Inclusion (EDI)
There are many challenges facing human resource management (HRM). Some of these challenges include attracting and retaining talent, maintaining employee engagement and morale, and improving work-life balance through opportunities for remote work. The purpose of this project is to bring awareness to the issue at the forefront of HRM: Equity, Diversity, and Inclusion (EDI). EDI practices have come a long way since they first began in the 1960s, evolving under the influence of legislation, social activism movements, and social media. Today, EDI is used as an umbrella term that refers to the programs, strategies, practices, and policies that reflect an organization’s effort to create an environment that respects and accommodates each individual employee. It is important to distinguish that equality is providing everyone with the same resources, while equity means fulfilling the unique needs of each employee. Diversity is the acceptance of differences and the appreciation of what makes us unique. Inclusion means fostering an environment to which employees feel they belong and in which they are comfortable. For this project, information from the Chartered Professionals in Human Resources of Alberta website, the Academy to Innovate Human Resources website, and other relevant and credible articles has been reviewed in order to help establish EDI
as the most important topic in HRM, not only today but in the future as well. Successful EDI policies will contribute to sustainable organizational practices and help combat issues such as bullying, harassment, and violence in the workplace, while improving employees’ mental health and the overall work culture of the organization. In the process of creating a more equitable, diverse, and inclusive work environment, HRM must be sure to avoid tokenistic practices, such as hiring only a small number of candidates in order to present the illusion of a more diverse workforce. HRM must look within its own practices and policies to be certain there are no barriers facing recruits. It must listen to employee concerns and opinions, and then implement the necessary changes to support them. EDI does not look the same in every organization; instead, HRM and the organization can collaborate to determine which practices will best improve the work life of employees. EDI is a long-term strategy that will take years to fully develop, but it is a key aspect of attracting and retaining potential candidates and stakeholders. In order to stay relevant and competitive, equity, diversity, and inclusion must be fully embraced by the organization; it is not only the smart thing to do, it is the right thing to do.
Faculty: Management
Department: Management
Designing and Securing Wi-Fi Connected Autonomous Vehicle
An autonomous vehicle (AV), or driverless vehicle, is one in which key functions such as steering, speed, and braking are controlled automatically by the vehicle, with no human intervention needed. This increases driving efficiency, passenger safety, and comfort. As the use of autonomous vehicles increases, the scope and severity of the associated dangers will also expand. While the cars’ connection to linked technologies such as the cloud via Wi-Fi enables increased speed and service quality, it also introduces additional dangers in the form of attacks by threat actors seeking to use these channels to their advantage. While wireless connectivity and cloud computing enable the provision of a diverse variety of dynamic resources, security is widely viewed as a major risk in cloud-connected automobiles. Our project aims to demonstrate the design and construction of an autonomous vehicle, including neural network training using machine learning. Furthermore, our project focuses on how to secure the wireless connectivity of self-driving cars by detecting deauthentication and man-in-the-middle attacks. This is achieved by creating different attack scenarios, performing attack analysis, and applying mitigation strategies. To build the autonomous vehicle, we used a Raspberry Pi as the main processing unit, acting as the master device and sending instructions to an Arduino Uno
(slave device). Sign and image detection is achieved through neural networks trained using machine learning.
To demonstrate the deauthentication and man-in-the-middle attacks, a Kali Linux machine was used as the attacking machine. Kali has various built-in tools for wireless hacking; for scanning and decrypting wireless networks, it comes with the Aircrack-ng suite of programs (including Airodump-ng), which our team used to execute the deauthentication attack. To illustrate the man-in-the-middle attack, we used two tools, namely Arpspoof and Ettercap. Arpspoof is a command-line (CLI) tool used to perform this attack: the autonomous vehicle is the target of an ARP (Address Resolution Protocol) spoofing attack launched by Arpspoof. As a result, all traffic between the AV and the default gateway passes through the attacker machine, making it possible to record it using Wireshark; Arpspoof also forwards this traffic. Ettercap is a GUI-based tool that can place the attacker between two machines and then allow the attacker to spoof the domain name server. Based on our analysis, enabling a virtual private network (VPN) is the best mitigation technique against man-in-the-middle attacks, and in the case of deauthentication attacks, using Wi-Fi Protected Access 3 (WPA3) was found to be the best solution.
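The detection side of the ARP spoofing scenario above can be sketched in a few lines: a monitor records the first MAC address seen claiming each IP and flags any later conflicting claim, which is the signature of ARP cache poisoning. This is an illustrative sketch only, not the project's actual tooling (Arpspoof, Ettercap, Wireshark); the function name and addresses are hypothetical.

```python
# Minimal sketch of ARP-spoof detection: flag any IP address whose
# claimed MAC address changes mid-session.
def detect_arp_spoofing(observations):
    """observations: iterable of (ip, mac) pairs as seen on the wire.
    Returns a list of (ip, first_mac, conflicting_mac) alerts."""
    table = {}    # ip -> first MAC address seen
    alerts = []
    for ip, mac in observations:
        if ip in table and table[ip] != mac:
            alerts.append((ip, table[ip], mac))   # conflicting claim
        else:
            table.setdefault(ip, mac)
    return alerts

traffic = [
    ("192.168.0.1", "aa:bb:cc:dd:ee:01"),   # gateway, legitimate
    ("192.168.0.7", "aa:bb:cc:dd:ee:07"),   # AV, legitimate
    ("192.168.0.1", "de:ad:be:ef:00:01"),   # attacker claims gateway IP
]
assert detect_arp_spoofing(traffic) == [
    ("192.168.0.1", "aa:bb:cc:dd:ee:01", "de:ad:be:ef:00:01")
]
```

A real deployment would feed this logic from captured ARP replies rather than a fixed list, and could trigger the VPN fallback the abstract recommends.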
Faculty: Management
Department: Management
Program: Master of Information Systems Security Management
Application of Crystal Kyber on AWS Cloud
Cloud storage is an increasingly popular means of storing data due to its flexibility, scalability, and affordability. However, with sensitive data being stored on the cloud at an increasing rate, security and privacy have become major concerns. To ensure the privacy and security of data stored in the cloud, several security requirements must be met, including confidentiality, integrity, availability, and authenticity. Confidentiality can be achieved through encryption and access controls, integrity can be ensured through data backups and checksums, availability can be guaranteed through redundancy and load balancing, and authenticity can be achieved through digital signatures and timestamps. To comply with data protection regulations and standards, cloud storage providers and users must work together to implement robust security measures that protect data from unauthorized access, tampering, or theft. Cryptographic libraries like CRYSTALS-Kyber can be used to offer reliable and effective security solutions for cloud storage, with an emphasis on post-quantum security to prevent attacks from quantum computers.
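The integrity requirement mentioned above — checksums that detect tampering — can be illustrated with a short standard-library sketch (the helper names are hypothetical, not part of the project's code):

```python
import hashlib

def checksum(data: bytes) -> str:
    """SHA-256 digest computed before an object is uploaded."""
    return hashlib.sha256(data).hexdigest()

def verify_integrity(data: bytes, expected: str) -> bool:
    """Recompute the digest after download and compare."""
    return checksum(data) == expected

original = b"quarterly-report-v1"
tag = checksum(original)
assert verify_integrity(original, tag)         # unmodified data passes
assert not verify_integrity(b"tampered", tag)  # any change is detected
```

In practice the digest would be stored separately from the object (for example, as S3 object metadata) so an attacker cannot alter both together.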
The method used in this project involved creating an AWS server and coding in Python using Visual Studio. The Python code was written to interact with AWS services, such as Amazon S3, and to perform cryptographic operations using the CRYSTALS-Kyber library. To test the functionality and performance of the
code, an application called Postman was used to make HTTP requests and obtain results. Overall, this method allowed for efficient development and testing of the cryptographic functionality on an AWS server, with the added benefit of being able to easily scale up the system if needed.
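Kyber follows the standard key-encapsulation (KEM) pattern: key generation, encapsulation against the public key, and decapsulation with the secret key. The mock below imitates only that call structure using plain hashing so the workflow is visible; it is not the CRYSTALS-Kyber algorithm and has no security whatsoever.

```python
import hashlib, os

# Structural mock of a KEM (keygen / encapsulate / decapsulate).
# Real Kyber derives these values from lattice operations; this
# placeholder only reproduces the interface shape.

def keygen():
    sk = os.urandom(32)
    pk = hashlib.sha256(b"pk" + sk).digest()
    return pk, sk

def encapsulate(pk):
    m = os.urandom(32)                        # ephemeral randomness
    ct = m                                    # mock "ciphertext" (insecure!)
    ss = hashlib.sha256(pk + m).digest()      # sender's shared secret
    return ct, ss

def decapsulate(sk, ct):
    pk = hashlib.sha256(b"pk" + sk).digest()  # re-derive public key
    return hashlib.sha256(pk + ct).digest()   # receiver's shared secret

pk, sk = keygen()
ct, ss_sender = encapsulate(pk)
assert decapsulate(sk, ct) == ss_sender       # both sides share one key
```

In the real system, the shared secret would then key a symmetric cipher protecting the data stored in S3.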
In conclusion, CRYSTALS-Kyber has proven to be faster and more secure than RSA in terms of encryption and decryption time and resistance against attacks. These results are promising, especially given that they were obtained on classical computers, which suggests that Kyber is an excellent candidate for post-quantum cryptography. With its focus on post-quantum security, Kyber offers algorithms designed to be both efficient and secure, making it a reliable cryptographic tool for cloud storage and other applications. Overall, Kyber’s performance and security capabilities make it a suitable replacement for RSA and other traditional cryptographic methods in the post-quantum era.
Faculty: Management
Department: Management
Program: Master of Information Systems Security Management
Radio Frequency Identification Implants in the Healthcare Industries
Radio Frequency Identification (RFID) technology has been recognized as a promising solution for improving patient safety and management in the healthcare industry. In particular, RFID implants can address issues such as patient identification, medication administration, and insufficient medical diagnosis during emergency situations. This paper proposes an RFID-based architecture for Emergency Medical Services (EMS) to tackle these problems.
Medical emergencies are a longstanding issue in the healthcare sector, and outside of hospitals, identifying patients and accessing their medical records can be challenging. Inaccurate diagnoses and mistakes in the prescription, administration, and distribution of medication can also occur. RFID technology uses radio signals to exchange identifying data between devices and can be used to address these issues.
This paper proposes an architecture for RFID implants in healthcare systems, which allows for the efficient, automated, and immediate collection and transmission of patient data without human involvement. The architecture includes a database that stores patient medical records, which can be accessed by authorized medical personnel using RFID readers. The proposed architecture also includes security and privacy
measures to protect patient data from potential attacks and vulnerabilities.
To analyze the security vulnerabilities of the proposed architecture, this paper identifies potential attacks and proposes security solutions to mitigate these risks. The proposed security measures include access control, encryption, and authentication protocols.
The main goal of the proposed architecture is to ensure the security and privacy of medical data while providing medical teams with quick access to patient information. This can help emergency teams make informed decisions about patient care, including medication administration and saving patients’ lives.
In conclusion, RFID implants have the potential to improve patient safety and management in the healthcare industry. The proposed architecture for RFID implants in healthcare systems offers a promising solution to address the challenges faced by EMS in emergency situations. Security and privacy measures are essential to protect patient data from potential attacks and vulnerabilities. Future research can explore the implementation of RFID technology in other areas of the healthcare industry and evaluate its effectiveness in improving patient care.
Faculty: Management
Department: Management
Program: Master of Information Systems Security Management
Exploring Current Solutions Against DDoS Attacks in SDN Environment
Software Defined Networking (SDN) offers a novel approach to network management, with the potential to simplify administration and enhance security. However, as with any emerging technology, it comes with vulnerabilities that may impact its availability and functionality. A prevalent attack on SDN controllers is Distributed Denial of Service (DDoS), which, while not entirely preventable, can have its effects mitigated. This study explores the factors influencing the performance of industry
solutions in defending SDN against DDoS attacks and evaluates their efficiency. It investigates various security challenges associated with DDoS attacks in both Information Technology (IT) and Operational Technology (OT) environments and contrasts the effects of distinct DDoS attack types at multiple levels of SDN communication, such as the Northbound/Southbound and East-West interfaces. Lastly, each applicable solution's effectiveness was assessed based on the controller's hardware utilization and response time.
Faculty: Management
Department: Management
Program: Master of Information Systems Security Management
Performance Evaluation in Named Data Networking under Content Poisoning and Cache Pollution Attacks in a Flashlight Topology
In the 21st century, effective data handling is essential for improving delivery, security, and performance. Researchers are proposing ways to reduce the number of devices traffic must traverse so that information can be delivered as quickly as possible, even under security challenges. Information Centric Networking (ICN) aims to increase overall performance relative to traditional networking: rather than the conventional model, in which a user reaches a web server through a series of routers that direct traffic toward it, ICN retrieves content by name. Security issues in the ICN data delivery process remain an open area for research. The main purpose of this paper is to discuss results under content poisoning and cache pollution attacks in a self-assembled topology of 96 consumers, 5 producers, and 10 routers, named the flashlight topology. In addition, the
mitigation techniques for these attacks will be discussed in this paper. For this large topology of 96 consumers, the ndnSIM simulator was used, under the common caching policies Least Recently Used (LRU) and First-In, First-Out (FIFO). Content poisoning and cache pollution attacks were generated effectively, illustrating the challenges they pose along the delivery path. The results and performance of data delivery under the normal caching policies are discussed, and the differences between the content poisoning attack and the cache pollution attack are articulated using the evaluation parameters. Overall, Information Centric Networking (ICN) supports efficient content retrieval applications, helping to relieve traffic problems; proper in-network caches are the basis of effective content retrieval in ICN. This paper discusses the results under these security attacks and the corresponding mitigation processes, which will be useful for future work.
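The cache pollution effect studied above can be sketched against a generic LRU cache: an attacker's stream of requests for unpopular names evicts the popular content that in-network caching relies on. This is an illustrative model, not the ndnSIM implementation, and the content names are hypothetical.

```python
from collections import OrderedDict

class LRUCache:
    """Generic least-recently-used content store, as in an NDN router."""
    def __init__(self, capacity):
        self.capacity = capacity
        self.store = OrderedDict()

    def request(self, name):
        """Return True on a cache hit; insert or refresh the entry."""
        hit = name in self.store
        if hit:
            self.store.move_to_end(name)          # refresh recency
        else:
            if len(self.store) >= self.capacity:
                self.store.popitem(last=False)    # evict LRU entry
            self.store[name] = True
        return hit

cache = LRUCache(capacity=2)
cache.request("/popular/video")        # legitimately cached
cache.request("/attacker/junk-1")      # pollution begins
cache.request("/attacker/junk-2")      # evicts /popular/video
assert cache.request("/popular/video") is False   # consumers now miss
```

Pollution succeeds precisely because the junk names look like ordinary misses to the policy; FIFO is vulnerable in the same way, which is why the paper evaluates both.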
Faculty: Management
Department: Management
Program: Master of Information Systems Security Management
Securing RFID-Based Attendance Management Systems: An Implementation of the Block Cipher Algorithm
Radio Frequency Identification (RFID) technology’s main feature is its ability to locate, track, and monitor people or objects. An RFID architecture consists of two major components: a tag and a reader. RFID is prone to major security and privacy attacks. This research aims to demonstrate the importance of encryption in securing RFID-based attendance management systems, with a specific focus on using the Advanced Encryption Standard (AES) block cipher. The application of the cipher
algorithm in encrypting the RFID tag’s unique identifier and the reader-side data will be implemented to show how it can enhance the security of tags. The proposed approach ensures secure data transmission, preventing unauthorized access and ensuring the confidentiality and integrity of collated attendance data. This research provides insights into the importance of encryption in securing RFID-based systems, which is critical to protecting sensitive data and ensuring the privacy and security of individuals.
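A minimal sketch of the idea — encrypting a tag's unique identifier so the attendance system never handles it in the clear — is shown below using the third-party `cryptography` package's Fernet recipe (an AES-based authenticated scheme). The library choice and the UID value are assumptions for illustration; the paper's actual implementation is not reproduced here.

```python
from cryptography.fernet import Fernet

# Sketch: encrypt an RFID tag UID so the attendance database stores
# and transmits only ciphertext. Fernet combines AES in CBC mode with
# an HMAC, providing both confidentiality and integrity.
key = Fernet.generate_key()      # provisioned to tag writer and reader
f = Fernet(key)

uid = b"04:A3:1B:9C:55:80:01"    # hypothetical tag unique identifier
token = f.encrypt(uid)           # ciphertext stored/transmitted

assert token != uid              # nothing readable leaves the reader
assert f.decrypt(token) == uid   # authorized reader recovers the UID
```

Key distribution between writer and readers is the hard part in practice and is exactly where the attacks the abstract mentions tend to land.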
Faculty: Management
Department: Management
Program: Master of Information Systems Security Management
Thursday, April 20, 2023
Alan Ling, Sergey Butakov
Security and Privacy Considerations for Self-Sovereign ID in Metaverse Health Care Applications
Major IT companies and the research community are working on implementing the Metaverse ecosystem in different sectors, including digital healthcare. With rising concern about personal data privacy, people expect to have more control over how third parties can use their medical data. This paper aims at the development of
tools that can facilitate healthcare services being carried out in the Metaverse ecosystem in a safe and trusted way. The project proposes a trust framework for digital healthcare applications running in the Metaverse environment. The framework is based on Self-Sovereign Identity (SSI) and has been tested against the design principles outlined for SSI-based architectures.
Faculty: Management
Department: Management
Self-Sovereign Identity in Blockchain Technology
This book chapter provides an overview of self-sovereign identity (SSI) as a revolutionary approach to digital identity management. The chapter begins by tracing the evolution of digital identity management from centralized identity to SSI. The need for a safe, sustainable, and trustworthy digital identity in the virtual world is emphasized. SSI is differentiated from decentralized identity management, and the chapter examines SSI’s identity protocol and architecture, including verifiable credentials, decentralized identifiers, decentralized identity, and decentralized key management systems.
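The verifiable-credential flow at the heart of the SSI architecture described above — an issuer signs claims, and any verifier holding the issuer's key can check them without contacting the issuer — can be sketched as follows. HMAC stands in for the digital-signature scheme a real SSI stack would use, and all names and keys are hypothetical.

```python
import hmac, hashlib, json

# Toy verifiable-credential flow: issuer signs claims; a verifier
# checks them offline. HMAC is a placeholder for a real signature
# scheme (e.g. Ed25519 over a DID document's verification key).
ISSUER_KEY = b"issuer-secret-demo-key"

def issue_credential(claims: dict) -> dict:
    payload = json.dumps(claims, sort_keys=True).encode()
    proof = hmac.new(ISSUER_KEY, payload, hashlib.sha256).hexdigest()
    return {"claims": claims, "proof": proof}

def verify_credential(cred: dict) -> bool:
    payload = json.dumps(cred["claims"], sort_keys=True).encode()
    expected = hmac.new(ISSUER_KEY, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(cred["proof"], expected)

vc = issue_credential({"id": "did:example:alice", "role": "nurse"})
assert verify_credential(vc)              # holder presents the credential
vc["claims"]["role"] = "surgeon"          # tampering attempt
assert not verify_credential(vc)          # verification fails
```

The holder, not a central registry, stores and presents the credential — which is the "self-sovereign" property the chapter's principles formalize.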
The chapter outlines twenty extended principles upon which SSI systems should be based, including sovereignty, data access control, data storage control, decentralization, verifiability, recovery, cost-free, security, privacy, safeguard, flexibility, accessibility, availability, transparency, portability, interoperability, scalability, sustainability, and longevity. These principles provide a comprehensive framework for establishing SSI systems.
The chapter further discusses the benefits and challenges of SSI systems. Benefits include data
privacy in healthcare, efficient financial services, and staff identification systems. Challenges include lack of legal recognition, user acceptance, interoperability, and scalability. Additionally, the chapter explores the use cases of SSI systems, such as part lifecycle support, competency assurance, Know Your Customer (KYC), Non-Fungible Tokens (NFTs), and authentication, authorization, and trust of Internet of Things users and devices. Finally, the chapter provides recommendations for implementing SSI systems in practice. The recommendations include implementing a user-centric design, ensuring open standards and interoperability, engaging stakeholders, developing trust frameworks, establishing governance and policy frameworks, providing education and awareness programs, and developing secure and sustainable systems.
In conclusion, this book chapter provides a comprehensive overview of self-sovereign identity (SSI) as a revolutionary shift in digital identity management. It examines the identity protocol and architecture of SSI, along with the benefits, challenges, and applications of the SSI system. It also outlines a set of principles and recommendations for establishing and implementing SSI systems in practice.
Faculty: Management
Department: Management
Blockchain Security Considerations
This book chapter provides a comprehensive overview of the security considerations associated with blockchain technology and discusses the major attacks and potential solutions to mitigate these challenges. The chapter covers security in blockchain technology, the OWASP Top Ten vulnerabilities, blockchain attributes for trustworthiness, attack vectors at the user, network, system, and smart-contract levels, and the blockchain-related attacks associated with each vector. It also suggests mitigation strategies for different attacks and best practices for companies to mitigate the risks these attacks pose. The chapter highlights some of the significant security concerns associated with blockchain technology, including 51% attacks, smart contract vulnerabilities, weaknesses in the underlying code, and denial-of-service attacks. It emphasizes that these concerns are particularly important for smaller or newer blockchains that may not have been thoroughly tested or audited; developers and users of blockchain technology must therefore be aware of these potential risks and take appropriate measures to mitigate them. The chapter offers several potential solutions to address these challenges, including using a hybrid consensus mechanism, employing multi-signature transactions, and conducting regular code reviews and security audits. It also suggests best practices for companies, such as providing regular training to employees, regularly reviewing blockchain security, and implementing a robust incident response plan.
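One mitigation named above — multi-signature transactions — requires m of n authorized parties to approve a transaction before it is valid. A standard-library sketch of the threshold check follows; HMAC stands in for the asymmetric signatures a real blockchain would use, and all signer names and keys are hypothetical.

```python
import hmac, hashlib

# 2-of-3 multi-signature check: a transaction is valid only when at
# least `threshold` registered signers have produced a valid MAC.
SIGNER_KEYS = {"alice": b"k1", "bob": b"k2", "carol": b"k3"}

def sign(name: str, tx: bytes) -> str:
    return hmac.new(SIGNER_KEYS[name], tx, hashlib.sha256).hexdigest()

def approved(tx: bytes, sigs: dict, threshold: int = 2) -> bool:
    valid = sum(
        1 for name, sig in sigs.items()
        if name in SIGNER_KEYS
        and hmac.compare_digest(sig, sign(name, tx))
    )
    return valid >= threshold

tx = b"transfer 5 tokens to treasury"
assert not approved(tx, {"alice": sign("alice", tx)})      # only 1-of-3
assert approved(tx, {"alice": sign("alice", tx),
                     "bob": sign("bob", tx)})              # 2-of-3 passes
assert not approved(tx, {"alice": sign("alice", tx),
                         "bob": "forged"})                 # bad signature
```

The design point is that compromising one key is no longer sufficient — the property that makes multisig an effective hedge against the key-theft attacks the chapter discusses.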
Faculty: Management
Department: Management
Program: Master of Information Systems Security Management
Telemedicine to Reduce Poverty Using Blockchain
Telemedicine has revolutionized healthcare delivery by allowing remote consultations and diagnosis through digital platforms. On the other hand, blockchain technology provides secure and transparent storage and sharing of medical records, making it an attractive tool for healthcare providers. The integration of telemedicine and blockchain has the potential to reduce poverty by providing efficient and cost-effective care to patients in impoverished areas. This book chapter explores the benefits and challenges of using telemedicine and blockchain in poverty reduction, and potential solutions to overcome those challenges. The chapter discusses the evolution of telemedicine and its integration with blockchain technology. It also highlights the use cases of blockchain technology in healthcare, such as secure storage of medical records and supply chain management. Additionally, it presents various telemedicine and blockchain initiatives implemented worldwide and their impact on poverty reduction. One of the major benefits of the integration of telemedicine and blockchain is increased access to affordable healthcare services. Non-profit organizations providing telemedicine services to patients in impoverished areas have been able to reach more
patients, resulting in improved health outcomes. The chapter also outlines the benefits of blockchain technology in healthcare, including increased security, transparency, and interoperability of medical records. The chapter discusses the application of blockchain technology in telemedicine, including the use of smart contracts and decentralized platforms for secure and transparent consultations. It also proposes a telemedicine architecture using blockchain technology, highlighting the best practices for organizations to set up a telemedicine blockchain. Despite the potential benefits of the integration of telemedicine and blockchain, there are several challenges that must be addressed, such as the lack of infrastructure and technical expertise in impoverished areas. The chapter proposes potential solutions, such as public-private partnerships and capacity building programs, to overcome these challenges. In conclusion, the integration of telemedicine and blockchain has the potential to revolutionize healthcare delivery and reduce poverty worldwide. The chapter provides valuable insights for healthcare providers, policymakers, and researchers interested in using these technologies to improve healthcare access and affordability in impoverished areas.
Faculty: Management
Department: Management
Program: Master of Information Systems Security Management
Identifying factors that influence online learning student satisfaction
Background: The pandemic forced universities to offer online learning so that students could continue their education. What are students’ perceptions of online learning? How satisfied are they with online vs. in-person learning?
Purpose: Identify the factors that influence online learning student satisfaction.
Method: A literature review informed the development of a questionnaire with three key constructs, which was then pre-tested. Data were collected via Facebook and email. A total of 112 students responded: 38% from Concordia University of Edmonton, 18% from MacEwan University, 12% from the University of Alberta, and 32% via email.
Results: Students view it as (1) unfair to charge regular fees in an online environment, and report that (2) in-person learning is more beneficial to mental health and well-being, and (3) in-person learning provides better access to professors.
Conclusion: The managerial implications are for the university to (1) reduce tuition for online courses, (2) implement a mix of asynchronous and synchronous delivery for all online courses, and (3) provide professional development for instructors to improve online delivery.
Faculty: Department: