Computer science articles within Nature Computational Science
News Feature | 24 September 2024
The lost data: how AI systems censor LGBTQ+ content in the name of safety
Many AI companies implement safety systems to protect users from offensive or inaccurate content. Though well intentioned, these filters can exacerbate existing inequalities, and data shows that they have disproportionately removed LGBTQ+ content.
- Sophia Chen
Research Briefing | 19 July 2024
A multi-task learning strategy to pretrain models for medical image analysis
Pretraining powerful deep learning models requires large, comprehensive training datasets, which are often unavailable for medical imaging. In response, the universal biomedical pretrained (UMedPT) foundational model was developed based on multiple small and medium-sized datasets. This model reduced the amount of data required to learn new target tasks by at least 50%.
Article 19 July 2024 | Open Access
Overcoming data scarcity in biomedical imaging with a foundational multi-task model
UMedPT, a foundational model for biomedical imaging, has been trained on a variety of medical tasks with different types of label. It has achieved high performance with less training data in various clinical applications.
- Raphael Schäfer
- , Till Nicke
- & Fabian Kiessling
Article | 19 February 2024
Automated discovery of algorithms from data
Automated algorithm discovery has been difficult for artificial intelligence given the immense search space of possible functions. Here explainable neural networks are used to discover algorithms that outperform those designed by humans.
- Paul J. Blazek
- , Kesavan Venkatesh
- & Milo M. Lin
Correspondence | 21 December 2023
Using proprietary language models in academic research requires explicit justification
- Alexis Palmer
- , Noah A. Smith
- & Arthur Spirling
Research Highlight | 20 December 2023
One algorithm to play them all
- Fernando Chirigati
Research Briefing | 18 December 2023
A transformer method that predicts human lives from sequences of life events
Transformer methods are revolutionizing how computers process human language. Exploiting the structural similarity between human lives, seen as sequences of events, and natural-language sentences, a transformer method — dubbed life2vec — has been used to create rich vector representations of human lives, from which accurate predictions can be made.
Review Article | 21 November 2023
Designing molecules with autoencoder networks
Autoencoders are versatile tools for molecular informatics with the opportunity for advancing molecule and drug design. In this Review, the authors highlight the active areas of development in the field and explore the challenges that need to be addressed moving forward.
- Agnieszka Ilnicka
- & Gisbert Schneider
Research Highlight | 15 November 2023
A full-stack platform for spiking deep learning
Memory and computation together at last.
Comment | 26 October 2023
Building open-source AI
Artificial intelligence (AI) drives innovation across society, economies and science. We argue for the importance of building AI technology according to open-source principles to foster accessibility, collaboration, responsibility and interoperability.
- Yash Raj Shrestha
- , Georg von Krogh
- & Stefan Feuerriegel
Editorial | 10 October 2023
Ada Lovelace, a role model for the ages
Ada Lovelace Day celebrates women in STEM careers, but also raises awareness of the challenges that women have faced in science, as well as the importance of female role models in STEM.
Q&A | 10 October 2023
Laying the foundations of programming and system design
Dr Barbara Liskov — a mostly retired Institute Professor at the Massachusetts Institute of Technology, a pioneer in object-oriented programming and distributed systems and the winner of the 2008 ACM A. M. Turing Award, which is the highest distinction in computer science — talks to Nature Computational Science about her work on data abstractions, her career trajectory and recognizing the contributions of women in computer science.
- Ananya Rastogi
Brief Communication 05 October 2023 | Open Access
Human-like intuitive behavior and reasoning biases emerged in large language models but disappeared in ChatGPT
The reasoning capabilities of OpenAI’s generative pre-trained transformer family were tested using semantic illusions and cognitive reflection tests that are typically used in human studies. While early models were prone to human-like cognitive errors, ChatGPT decisively outperformed humans, avoiding the cognitive traps embedded in the tasks.
- Thilo Hagendorff
- , Sarah Fabi
- & Michal Kosinski
Research Highlight | 17 August 2023
Optimal crystal structure solutions
Perspective | 24 July 2023
Designing equitable algorithms
While adherence to fairness constraints has become common practice in the design of algorithms across many contexts, a more holistic approach should be taken to avoid inflicting additional burdens on individuals in all groups, including those in marginalized communities.
- Alex Chohlas-Wood
- , Madison Coots
- & Julian Nyarko
Research Highlight | 20 April 2023
Moving toward safer driverless vehicles
Research Highlight | 23 January 2023
Large language model for molecular chemistry
News & Views | 05 January 2023
Dimensionality reduction under scrutiny
A recent study proposes a framework for generating and interpreting dynamic visualizations from traditional static dimensionality-reduction visualization methods.
- , Zewen K. Tuong
- & Di Yu
Resource | 30 December 2022
Dynamic visualization of high-dimensional data
Data visualization is widely used in science, but interpreting such visualizations is prone to error. Here a dynamic visualization is introduced for capturing more information and improving the reliability of visual interpretations.
- Eric D. Sun
- & James Zou
Research Highlight | 08 December 2022
Artificial agents that negotiate and reach agreements
- Iryna Omelchenko
Article | 28 November 2022
Combining computational controls with natural text reveals aspects of meaning composition
A neural network-based language model of supra-word meaning, that is, the combined meaning of words in a sentence, is proposed. Analysis of functional magnetic resonance imaging and magnetoencephalography data helps identify the regions of the brain responsible for understanding this meaning.
- Mariya Toneva
- , Tom M. Mitchell
- & Leila Wehbe
Editorial | 18 July 2022
Mathematics, the queen of sciences
We highlight how this year’s awardees from some of the most important prizes in mathematics have had an impact in the computational science community.
Editorial | 23 May 2022
Cracking the code review process
What does it entail to perform a code review for Nature Computational Science?
Q&A | 01 May 2022
Fighting hate speech and misinformation online
Dr Srijan Kumar, assistant professor at Georgia Institute of Technology and a Forbes 30 Under 30 honoree in science, discusses with Nature Computational Science how he uses machine learning and data science to identify and mitigate malicious activities on online platforms, including misinformation and anti-Asian hate speech.
Research Highlight | 21 April 2022
Gauging urban development with neural networks
Editorial | 21 April 2022
And the Turing Award goes to…
The 2021 A. M. Turing Award celebrates Jack Dongarra’s contributions in high-performance computing, which have had a significant impact in computational science.
Q&A | 21 April 2022
A key player in high-performance computing
Dr Jack Dongarra, Distinguished Professor at the University of Tennessee and recipient of the 2021 A. M. Turing Award, spoke with Nature Computational Science about his contributions to high-performance computing (HPC) and his insights into the future of this field.
Q&A | 01 February 2022
Crypto and technology for the people
Dr Seny Kamara, associate professor at Brown University, talks to Nature Computational Science about his current research focus on the intersection between social responsibility and cryptography/technology, as well as about the need for multidisciplinary work in this arena.
Perspective | 31 January 2022
Opportunities for neuromorphic computing algorithms and applications
A wide variety of challenges still restricts the rapid growth of neuromorphic algorithm and application development. Addressing these challenges is essential for the research community to be able to effectively use neuromorphic computers in the future.
- Catherine D. Schuman
- , Shruti R. Kulkarni
- & Bill Kay
Resource 27 January 2022 | Open Access
The fast continuous wavelet transformation (fCWT) for real-time, high-quality, noise-resistant time–frequency analysis
The authors present an open-source framework that enables fast and accurate time–frequency analysis of signals and demonstrate it on real-world applications, such as signals from a brain–computer interface.
- Lukas P. A. Arts
- & Egon L. van den Broek
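For orientation on what the fCWT accelerates: the continuous wavelet transform itself can be written down directly. The sketch below is a naive O(n²)-per-scale Morlet CWT in plain Python — an illustration of the transform only, not the fCWT algorithm, whose contribution is computing this via FFTs orders of magnitude faster.

```python
import cmath
import math

def cwt_morlet(signal, scales, w0=6.0):
    """Naive continuous wavelet transform with a complex Morlet wavelet.
    O(n^2) work per scale; returns coefficients[scale][position]."""
    n = len(signal)
    coeffs = []
    for s in scales:
        norm = 1.0 / math.sqrt(s)  # keep energy comparable across scales
        row = []
        for b in range(n):
            acc = 0j
            for k in range(n):
                t = (k - b) / s
                # conjugated Morlet wavelet: Gaussian-windowed oscillation
                acc += signal[k] * cmath.exp(-1j * w0 * t) * math.exp(-t * t / 2)
            row.append(norm * acc)
        coeffs.append(row)
    return coeffs
```

For a pure sinusoid, the response peaks at the scale whose center frequency w0/s matches the signal frequency, which is what makes the scale axis interpretable as a frequency axis.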
Research Highlight | 21 January 2022
Machine learning to guide mathematicians
News & Views | 25 November 2021
Advancing data compression via noise detection
Compressing scientific data is essential to save on storage space, but doing so effectively while ensuring that the conclusions from the data are not affected remains a challenging task. A recent paper proposes a new method to identify numerical noise from floating-point atmospheric data, which can lead to a more effective compression.
- Dorit M. Hammerling
- & Allison H. Baker
Article 25 November 2021 | Open Access
Compressing atmospheric data into its real information content
Climate data are often stored at higher precision than is needed. The proposed compression automatically determines the precision from the data’s bitwise real information, removing any false information and leading to a more efficient compression.
- Milan Klöwer
- , Miha Razinger
- & Tim N. Palmer
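The idea of discarding bits that carry no real information can be illustrated with float bit masking. The sketch below rounds a float64 to a chosen number of mantissa bits so that the trailing bytes become highly repetitive and compress well; it is an illustration only — the paper's contribution is determining `keepbits` automatically from the data's bitwise information content, which this sketch takes as given.

```python
import struct

def round_mantissa(x: float, keepbits: int) -> float:
    """Keep only `keepbits` of the 52 float64 mantissa bits
    (round-to-nearest), zeroing the rest so the byte stream
    compresses far better afterwards."""
    if keepbits >= 52:
        return x
    bits = struct.unpack("<Q", struct.pack("<d", x))[0]
    drop = 52 - keepbits
    half = 1 << (drop - 1)                            # half of the last kept ulp
    mask = ~((1 << drop) - 1) & 0xFFFFFFFFFFFFFFFF    # clear dropped bits
    bits = (bits + half) & mask                       # round to nearest, truncate
    return struct.unpack("<d", struct.pack("<Q", bits))[0]

# Keeping 7 mantissa bits preserves roughly two significant decimal
# digits while zeroing the 45 trailing bits of every value.
rounded = round_mantissa(3.14159265, 7)
```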
Research Highlight | 12 November 2021
Accuracy and fairness go hand in hand
Accurate short-term precipitation prediction.
Correspondence | 11 October 2021
Democratizing interactive, immersive experiences for science education with WebXR
- Fabio Cortés Rodríguez
- , Matteo Dal Peraro
- & Luciano A. Abriata
Article | 22 September 2021
Explainable neural networks that simulate reasoning
The authors demonstrate how neural systems can encode cognitive functions, and use the proposed model to train robust, scalable deep neural networks that are explainable and capable of symbolic reasoning and domain generalization.
Research Highlight | 15 July 2021
Detection of war destruction from satellite images
Article | 24 June 2021
The power of quantum neural networks
A class of quantum neural networks is presented that outperforms comparable classical feedforward networks. They achieve a higher capacity in terms of effective dimension and at the same time train faster, suggesting a quantum advantage.
- Amira Abbas
- , David Sutter
- & Stefan Woerner
Research Highlight | 09 June 2021
Accurate and efficient fluid flow analysis
Comment | 13 May 2021
New views of black holes from computational imaging
The unique challenges associated with imaging a black hole motivated the development of new computational imaging algorithms. As the Event Horizon Telescope continues to expand, these algorithms will need to evolve to keep pace with the increasingly demanding volume and dimensionality of the data.
- Kazunori Akiyama
- , Andrew Chael
- & Dominic W. Pesce
News & Views | 25 March 2021
Efficient deep learning
The computational complexity of deep neural networks is a major obstacle of many application scenarios driven by low-power devices, including federated learning. A recent finding shows that random sketches can substantially reduce the model complexity without affecting prediction accuracy.
- Shiqiang Wang
Article | 25 March 2021
Random sketch learning for deep neural networks in edge computing
Developing lightweight deep neural networks, while essential for edge computing, still remains a challenge. Random sketch learning is a method that creates computationally efficient and compact networks, thus paving the way for deploying tiny machine learning (TinyML) in resource-constrained devices.
- , Peijun Chen
- & Jun Zhang
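As rough intuition for why sketching compresses networks (this is a generic random-sign projection, an assumption for illustration, not the paper's specific sketch-learning algorithm): projecting two vectors with the same random ±1 matrix preserves their inner product in expectation while storing only k of d dimensions.

```python
import random

def sign_sketch(vec, k, seed=0):
    """Project `vec` (length d) to length k with a seeded random +/-1
    matrix.  E[(Sx . Sw) / k] = x . w, so sketches of weights and
    activations approximately preserve inner products at k/d memory."""
    rng = random.Random(seed)
    d = len(vec)
    signs = [[rng.choice((-1.0, 1.0)) for _ in range(d)] for _ in range(k)]
    return [sum(s * v for s, v in zip(row, vec)) for row in signs]
```

Because the sign matrix is regenerated from the seed, both operands of a dot product must be sketched with the same seed; the projection is linear, so sketches of sums equal sums of sketches.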
Research Highlight | 17 March 2021
Porting software without affecting scientific results
Article | 01 February 2021
Larger GPU-accelerated brain simulations with procedural connectivity
Spiking neural network simulations are very memory-intensive, limiting large-scale brain simulations to high-performance computer systems. Knight and Nowotny propose using procedural connectivity to substantially reduce the memory footprint of these models, such that they can run on standard GPUs.
- James C. Knight
- & Thomas Nowotny
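The trick can be sketched in a few lines: instead of storing a connection matrix, derive each presynaptic neuron's target list from a deterministic per-neuron seed and regenerate it whenever that neuron spikes. This is a minimal stdlib illustration of the idea, not the authors' GPU implementation.

```python
import random

def targets(pre: int, n_post: int, p: float, seed: int = 1234) -> list:
    """Regenerate neuron `pre`'s outgoing connections on demand from a
    deterministic per-neuron RNG stream instead of storing them.
    Memory cost is O(1) per neuron; every call returns the same list."""
    rng = random.Random(seed * 1_000_003 + pre)  # reproducible per-neuron stream
    return [post for post in range(n_post) if rng.random() < p]

# A simulator can rebuild a row of the (virtual) connection matrix
# whenever `pre` spikes, trading a little compute for a lot of memory.
row = targets(pre=42, n_post=1000, p=0.1)
```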
Editorial | 14 January 2021
Celebrating today, inspiring tomorrow
Computational and mathematical models are at the core of myriad research developments across different domains. We pay tribute to the importance of computational science by providing a dedicated home among the Nature portfolio for this inspiring field.
Perspective | 14 January 2021
Quantifying causality in data science with quasi-experiments
While estimating causality from observational data is challenging, quasi-experiments provide causal inference methods with plausible assumptions that are practical for a range of real-world problems.
- , Lyle Ungar
- & Konrad Kording
The Denotational Semantics of SSA
imbrem/debruijn-ssa • 14 Nov 2024
Static single assignment form, or SSA, has been the dominant compiler intermediate representation for decades.
Programming Languages Logic in Computer Science F.3.2; D.3.4; D.3.1
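As a reminder of what SSA form does (a toy illustration over an assumed three-address IR, not the paper's formalism): every variable is assigned exactly once, so reassignments become fresh numbered versions and each use refers to the latest one.

```python
def to_ssa(program):
    """Rename a straight-line program into SSA form.  `program` is a
    list of (dest, op, args) triples over plain variable names; free
    variables and constants pass through unversioned.  Straight-line
    only, so no phi-nodes are needed."""
    version = {}  # current version number of each assigned variable
    out = []
    for dest, op, args in program:
        new_args = [f"{a}{version[a]}" if a in version else a for a in args]
        version[dest] = version.get(dest, 0) + 1
        out.append((f"{dest}{version[dest]}", op, new_args))
    return out

# x = a + b; x = x * 2   becomes   x1 = a + b; x2 = x1 * 2
ssa = to_ssa([("x", "add", ["a", "b"]), ("x", "mul", ["x", "2"])])
```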
PIMCOMP: An End-to-End DNN Compiler for Processing-In-Memory Accelerators
sunxt99/pimcomp-nn • 14 Nov 2024
How to deploy DNNs onto PIM-based accelerators is key to exploiting PIM's high performance and energy efficiency.
Hardware Architecture
In Serverless, OS Scheduler Choice Costs Money: A Hybrid Scheduling Approach for Cheaper FaaS
ZhaoNeil/hybrid-scheduler • 13 Nov 2024
We present evidence that relying on the default Linux CFS scheduler increases the cost of serverless workloads by up to 10X.
Distributed, Parallel, and Cluster Computing
CorrectBench: Automatic Testbench Generation with Functional Self-Correction using LLMs for HDL Design
autobench/correctbench • 13 Nov 2024
The comparative analysis demonstrates that our method achieves a pass ratio of 70.13% across all evaluated tasks, compared with the previous LLM-based testbench generation framework's 52.18% and a direct LLM-based generation method's 33.33%.
Software Engineering
Learning-Based Control Barrier Function with Provably Safe Guarantees: Reducing Conservatism with Heading-Aware Safety Margin
bassamlab/sigmarl • 13 Nov 2024
We propose a learning-based Control Barrier Function (CBF) to reduce conservatism in collision avoidance of car-like robots.
Robotics Multiagent Systems Systems and Control
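The core mechanism of a control barrier function can be shown in one dimension (an illustrative sketch, not the paper's learned heading-aware CBF): keep a barrier h(x) ≥ 0 invariant by constraining the control so that ḣ ≥ -αh, which here reduces to clamping a nominal velocity.

```python
def safe_velocity(x: float, v_nominal: float,
                  x_wall: float = 10.0, alpha: float = 1.0) -> float:
    """Minimal 1-D control-barrier-function filter.  Barrier
    h(x) = x_wall - x >= 0 encodes "stay left of the wall".  With
    dynamics x' = v, the CBF condition h' >= -alpha * h becomes
    v <= alpha * (x_wall - x): clamp the nominal command to that."""
    h = x_wall - x
    v_max = alpha * h
    return min(v_nominal, v_max)

# Far from the wall the nominal command passes through unchanged;
# near the wall it is scaled down, and past it the filter commands
# a retreat, so x_wall is never crossed under the filtered control.
```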
RINO: Accurate, Robust Radar-Inertial Odometry with Non-Iterative Estimation
yangsc4063/rino • 12 Nov 2024
Additionally, the approach implements a loosely coupled system between the scanning radar and an inertial measurement unit (IMU), using an error-state Kalman filter (ESKF).
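Loosely coupled fusion of this kind rests on the Kalman predict/update cycle. The scalar sketch below is a toy stand-in for the error-state filter (the real ESKF tracks a multivariate error state with full covariance): an IMU-style increment drives the prediction and a radar-style position fix drives the update.

```python
def kf_step(x, p, u, z, q=0.1, r=0.5):
    """One scalar Kalman-filter cycle.
    x, p: state estimate and its variance
    u: motion increment (predict step, process noise q)
    z: position measurement (update step, measurement noise r)"""
    # Predict: propagate with the motion increment, inflate uncertainty.
    x, p = x + u, p + q
    # Update: blend in the measurement, weighted by the Kalman gain.
    k = p / (p + r)
    x = x + k * (z - x)
    p = (1 - k) * p
    return x, p
```

The gain k balances the two sensors: a noisy radar (large r) pulls the estimate only slightly, while an uncertain prediction (large p) defers to the measurement; the update always shrinks the variance.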
Formalization of physics index notation in Lean 4
HEPLean/HepLean • 12 Nov 2024
The physics community relies on index notation to effectively manipulate types of tensors.
Logic in Computer Science High Energy Physics - Phenomenology High Energy Physics - Theory
A Framework for Carbon-aware Real-Time Workload Management in Clouds using Renewables-driven Cores
tharindu-b-hewage/openstack-gc • 12 Nov 2024
To this end, we present a framework to harvest green renewable energy for real-time workloads in cloud systems.
SoliDiffy: AST Differencing for Solidity Smart Contracts
mojtaba-eshghie/SoliDiffy • 12 Nov 2024
Smart contracts, primarily written in Solidity, are integral to blockchain software applications, yet precise analysis and maintenance are hindered by the limitations of existing differencing tools.
Software Engineering Programming Languages
Web-Based Simulator of Superscalar RISC-V Processors
Sekky61/riscv-sim • 12 Nov 2024
Mastering computational architectures is essential for developing fast and power-efficient programs.
Computer science is the study and development of the protocols required for automated processing and manipulation of data.
cs.CY - Computers and Society: Covers impact of computers on society, computer ethics, information technology and public policy, legal aspects of computing, computers and education. Roughly includes material in ACM Subject Classes K.0, K.2, K.3, K.4, K.5, and K.7.
arXiv is a free distribution service and an open-access archive for nearly 2.4 million scholarly articles in the fields of physics, mathematics, computer science, quantitative biology, quantitative finance, statistics, electrical engineering and systems science, and economics. Materials on this site are not peer-reviewed by arXiv.
Code Generation for Conic Model-Predictive Control on Microcontrollers with TinyMPC
TinyMPC/TinyMPC • 26 Mar 2024
Conic constraints appear in many important control applications like legged locomotion, robotic manipulation, and autonomous rocket landing.
Robotics Systems and Control Optimization and Control
Community Research Earth Digital Intelligence Twin (CREDIT)
John Schreck, Yingkai Sha, William Chapman, Dhamma Kimpara, Judith Berner, Seth McGinnis, Arnold Kazadi, Negin Sobhani, Ben Kirk, David John Gagne II
Subjects: Artificial Intelligence (cs.AI); Atmospheric and Oceanic Physics (physics.ao-ph)
Improving the detection of technical debt in Java source code with an enriched dataset
namcyan/tesoro • 8 Nov 2024
To fill such a gap, in this study, through the analysis of comments and their associated source code from 974 Java projects hosted in the Stack corpus, we curated the first ever dataset of TD identified by code comments ...
tencent/tencent-hunyuan-large • 4 Nov 2024
In this paper, we introduce Hunyuan-Large, which is currently the largest open-source Transformer-based mixture of experts model, with a total of 389 billion parameters and 52 billion activation parameters, capable of handling up to 256K tokens.
Logical Reasoning Mathematical Problem-Solving