
Investigating The Security and Privacy Issues in ChatGPT Usage and Their Impact on Organisational and Individual Security

Polra Victor Falade¹

  1. Dept. of Cyber Security, Nigerian Defence Academy, Kaduna, Nigeria.

Section: Research Paper, Product Type: Journal-Paper
Vol.10, Issue.3, pp.19-30, Mar-2024


Published online on Mar 31, 2024


Copyright © Polra Victor Falade. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
 




How to Cite this Paper


IEEE Style Citation: Polra Victor Falade, “Investigating The Security and Privacy Issues in ChatGPT Usage and Their Impact on Organisational and Individual Security,” International Journal of Scientific Research in Multidisciplinary Studies, Vol.10, Issue.3, pp.19-30, 2024.

MLA Style Citation: Polra Victor Falade "Investigating The Security and Privacy Issues in ChatGPT Usage and Their Impact on Organisational and Individual Security." International Journal of Scientific Research in Multidisciplinary Studies 10.3 (2024): 19-30.

APA Style Citation: Polra Victor Falade (2024). Investigating The Security and Privacy Issues in ChatGPT Usage and Their Impact on Organisational and Individual Security. International Journal of Scientific Research in Multidisciplinary Studies, 10(3), 19-30.

BibTex Style Citation:
@article{Falade_2024,
author = {Polra Victor Falade},
title = {Investigating The Security and Privacy Issues in ChatGPT Usage and Their Impact on Organisational and Individual Security},
journal = {International Journal of Scientific Research in Multidisciplinary Studies},
issue_date = {3 2024},
volume = {10},
issue = {3},
month = {3},
year = {2024},
issn = {2347-2693},
pages = {19-30},
url = {https://www.isroset.org/journal/IJSRMS/full_paper_view.php?paper_id=3436},
publisher = {IJCSE, Indore, INDIA},
}

RIS Style Citation:
TY - JOUR
UR - https://www.isroset.org/journal/IJSRMS/full_paper_view.php?paper_id=3436
TI - Investigating The Security and Privacy Issues in ChatGPT Usage and Their Impact on Organisational and Individual Security
T2 - International Journal of Scientific Research in Multidisciplinary Studies
AU - Polra Victor Falade
PY - 2024
DA - 2024/03/31
PB - IJCSE, Indore, INDIA
SP - 19-30
IS - 3
VL - 10
SN - 2347-2693
ER -

  
  

Abstract :
Artificial intelligence (AI) technologies, particularly AI-driven chatbots, have become integral to modern communication and task automation. Among these, ChatGPT, developed by OpenAI, has garnered significant attention as a widely accessible natural language processing chatbot. However, concerns regarding security and privacy have accompanied its widespread adoption. This research paper employs blog mining as a methodology to provide a comprehensive analysis of the security and privacy implications surrounding ChatGPT. By extracting insights from relevant blog posts, combined with an examination of existing literature, this study identifies inherent vulnerabilities within the platform and explores potential threats that could exploit these weaknesses. Specifically, the research highlights the risks associated with the mishandling of personal information, the susceptibility to cyber-attacks, and the broader implications for individual privacy and organizational security. Moreover, the paper offers insights into the regulatory landscape governing AI chatbots and suggests future research directions to address these challenges effectively. Ultimately, this study underscores the critical importance of implementing robust security measures and regulatory frameworks to mitigate the risks associated with AI-driven chatbots and ensure their responsible deployment in the digital age.
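
The blog-mining methodology described in the abstract can be pictured with a minimal sketch. The snippet below is an assumption-laden illustration rather than the author's actual pipeline: the URL list, the theme keywords, and the simple keyword tally are hypothetical stand-ins for the paper's extraction of insights from relevant blog posts.

# A minimal, illustrative sketch of a blog-mining pass, assuming a hypothetical
# list of blog-post URLs and a hand-picked theme list; not the author's pipeline.
# It fetches each page, strips the HTML, and tallies security/privacy keywords
# to surface recurring themes.
import urllib.request
from collections import Counter
from html.parser import HTMLParser


class TextExtractor(HTMLParser):
    """Collects visible text from an HTML page, skipping script/style blocks."""

    def __init__(self):
        super().__init__()
        self._skip = False
        self.chunks = []

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip = True

    def handle_endtag(self, tag):
        if tag in ("script", "style"):
            self._skip = False

    def handle_data(self, data):
        if not self._skip:
            self.chunks.append(data)


# Hypothetical theme list; a real study would derive themes from the corpus.
THEMES = ["privacy", "data leak", "phishing", "credential", "vulnerability", "gdpr"]


def mine_post(url: str) -> Counter:
    """Fetch one blog post and count theme keywords in its visible text."""
    with urllib.request.urlopen(url, timeout=10) as resp:
        html = resp.read().decode("utf-8", errors="ignore")
    parser = TextExtractor()
    parser.feed(html)
    text = " ".join(parser.chunks).lower()
    return Counter({t: text.count(t) for t in THEMES})


if __name__ == "__main__":
    # Hypothetical URLs -- replace with the blog posts under study.
    urls = ["https://example.com/chatgpt-privacy-blog-post"]
    totals = Counter()
    for u in urls:
        totals.update(mine_post(u))
    for theme, count in totals.most_common():
        print(f"{theme}: {count}")

In practice, an automated pass like this would typically be followed by manual reading and thematic grouping of the matched passages before drawing the kinds of conclusions reported in the paper.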

Key-Words / Index Term:
ChatGPT, Security Issues, Privacy Issues, OpenAI, AI, Vulnerabilities, Threats

