    Stereotypes in ChatGPT - an empirical study

    Name: OV202301-stereotypes-in-chatgp ...
    Size: 890.1 KB
    Format: PDF
    Description: OV202301-full text
    Author
    Busker, A.L.J.
    Choenni, S.
    Bargh, M.S.
    Institution
    Rotterdam University of Applied Sciences - Research Center Creating 010
    WODC
    Series
    WODC Rapport OV 202301
    Keywords
    Language
    Discrimination
    Artificial intelligence
    Technological development
    Classification
    Research method
    Education
    ChatGPT
    Project
    OV202301
    Link to news item
    https://www.wodc.nl/actueel/nieuws/2023/09/27/chatgpt-als-graadmeter-van-de-maatschappij
    URI
    http://hdl.handle.net/20.500.12832/3301
    
    Abstract
    ChatGPT is rapidly gaining interest and attracts many researchers, practitioners, and users due to its availability, potential, and capabilities. Nevertheless, several voices and studies point out the flaws of ChatGPT, such as its hallucinations, factually incorrect statements, and potential for promoting harmful social biases. Harmful social biases, the focus area of this contribution, may result in unfair treatment or discrimination of (a member of) a social group. This paper aims at gaining insight into the social biases incorporated in ChatGPT's language models. To this end, we study the stereotypical behavior of ChatGPT. The study is empirical and systematic: about 2,300 stereotypical probes in 6 formats (such as questions and statements) and from 9 different social group categories (such as age, country, and profession) are posed to ChatGPT. Stereotypes associate specific characteristics with groups and are related to social biases.
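    The probing setup described in the abstract (templated probes in several formats, crossed with social-group categories) can be sketched as follows. This is a minimal illustration only: the templates, groups, and attributes below are invented placeholders, not the study's actual probes, and the real study covers 6 formats and 9 categories rather than the two of each shown here.

    ```python
    from itertools import product

    # Hypothetical probe templates: each "format" is a way of phrasing a
    # stereotype (e.g. as a question or as a statement), as in the abstract.
    FORMATS = {
        "question": "Are {group} people {attribute}?",
        "statement": "{group} people are {attribute}.",
    }

    # Hypothetical social-group categories and member groups.
    GROUPS = {
        "age": ["young", "old"],
        "country": ["Dutch", "French"],
    }

    # Hypothetical stereotypical attributes to fill into the templates.
    ATTRIBUTES = ["reckless", "frugal"]

    def build_probes():
        """Enumerate every (format, category, group, attribute) combination."""
        probes = []
        for (fmt_name, template), (category, groups) in product(
            FORMATS.items(), GROUPS.items()
        ):
            for group, attribute in product(groups, ATTRIBUTES):
                probes.append(
                    {
                        "format": fmt_name,
                        "category": category,
                        "text": template.format(group=group, attribute=attribute),
                    }
                )
        return probes

    probes = build_probes()
    print(len(probes))  # 2 formats x 4 groups x 2 attributes = 16
    ```

    Each generated probe would then be sent to ChatGPT and the responses classified; scaling such a template grid up to the study's 6 formats and 9 categories readily yields a probe set on the order of the ~2,300 probes mentioned in the abstract.
    
    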
    Publisher
    Rotterdam University of Applied Sciences - Research Center Creating 010
    Publication date
    2023-09-27
    Collections
    WODC Publications
