Articles

  • Jan 24, 2025 | dx.doi.org | Yang Zhao | Yanan Wang | Zhipeng Yu | Jingwei Wang

  • Nov 22, 2024 | ceramics.onlinelibrary.wiley.com | Guanjie Li | Danyang Liu | Ying Liu | Jingwei Wang


  • Jul 2, 2024 | preprints.org | Jingwei Wang

    Preprint, Article, Version 1 (not peer-reviewed). Received: 28 June 2024 / Approved: 2 July 2024 / Online: 2 July 2024 (14:52:49 CEST). Wang, J. Exploring Vulnerabilities in BERT Models. Preprints 2024, 2024070204. https://doi.org/10.20944/preprints202407.0204.v1

  • Jun 29, 2024 | preprints.org | Jingwei Wang

    Preprint, Article, Version 1 (not peer-reviewed). Received: 28 June 2024 / Approved: 28 June 2024 / Online: 29 June 2024 (06:22:54 CEST). Wang, J. An Embarrassingly Simple Method to Compromise Language Models. Preprints 2024, 2024062045. https://doi.org/10.20944/preprints202406.2045.v1

  • Jun 24, 2024 | preprints.org | Jingwei Wang

    Preprint, Article, Version 1 (not peer-reviewed). Received: 24 June 2024 / Approved: 24 June 2024 / Online: 24 June 2024 (13:53:36 CEST). Wang, J. Analyzing Multi-Head Attention on Broken BERT Models. Preprints 2024, 2024061669. https://doi.org/10.20944/preprints202406.1669.v1
