About 657 results
www.reddit.com
reddit.com › r › europ..._reservists_trained ›
...
arxiv.org
arxiv.org › abs › 2509.04516v2
As large language models (LLMs) expand multilingual capabilities, questions remain about the equity of their performance across languages. While many communities stand to benefit from AI systems, the ...
arxiv.org
arxiv.org › abs › 2404.12407v1
The era of pre-trained models has ushered in a wealth of new insights for the machine learning community. Among the myriad of questions that arise, one of paramount importance is: 'Do pre-trained mode...
arxiv.org
arxiv.org › abs › 2512.13552v1
This work introduces PrahokBART, a compact pre-trained sequence-to-sequence model trained from scratch for Khmer using carefully curated Khmer and English corpora. We focus on improving the pre-...
arxiv.org
arxiv.org › abs › 1910.04760v4
Large, pre-trained generative models have been increasingly popular and useful to both the research and wider communities. Specifically, BigGANs, a class of conditional Generative Adversarial Networks tra...
research.google
research.google › blog › h...underwater-mysteries
Discover how Google researchers repurposed AI models trained on bird songs to detect whale calls and track marine health. Explore the future of bioacoustic monitoring.
arxiv.org
arxiv.org › abs › 2602.08818v1
Recent advances in mixture-of-experts architectures have shown that individual expert models can be trained federatedly, i.e., in isolation from other experts, by using a common base model to facilita...
arxiv.org
arxiv.org › abs › 2312.09052v1
We aim to investigate if we can improve predictions of stress caused by OCD symptoms using pre-trained models, and present our statistical analysis plan in this paper. With the methods presented in th...
arxiv.org
arxiv.org › abs › 2203.09127v3
Pre-trained models (PTMs) have become a fundamental backbone for downstream tasks in natural language processing and computer vision. Despite initial gains that were obtained by applying generic PTMs ...
arxiv.org
arxiv.org › abs › 2508.10160v1
Neural decoding of pathological and physiological states can enable patient-individualized closed-loop neuromodulation therapy. Recent advances in pre-trained large-scale foundation models offer the p...
arxiv.org
arxiv.org › abs › 2307.04765v1
In this study, the performance of the Whisper-Small and Wav2Vec2-XLS-R-300M models, two pre-trained multilingual speech-to-text models, was examined for the Turkish language. Mozilla Com...
arxiv.org
arxiv.org › abs › 2209.04252v1
We propose a novel method for generating high-resolution videos of talking-heads from speech audio and a single 'identity' image. Our method is based on a convolutional neural network model that incor...
arxiv.org
arxiv.org › abs › 2308.02019v2
We present our submission to the BabyLM challenge, whose goal was to improve the sample efficiency of language models. We trained an ensemble consisting of GPT-2 and small LLaMA models on the develo...
arxiv.org
arxiv.org › abs › 2211.00151v3
Pre-trained language models (PLMs) may fail in giving reliable estimates of their predictive uncertainty. We take a close look into this problem, aiming to answer two questions: (1) Do PLMs learn to b...
arxiv.org
arxiv.org › abs › 2006.07322v5
Modern neural architectures for classification tasks are trained using the cross-entropy loss, which is widely believed to be empirically superior to the square loss. In this work we provide evidence ...
arxiv.org
arxiv.org › abs › 2512.11146v1
Using newly-assembled data from 1980 through 2024, we show that 25% of scientifically-active, US-trained STEM PhD graduates leave the US within 15 years of graduating. Leave rates are lower in the lif...
arxiv.org
arxiv.org › abs › 1911.09249v1
Purpose: We propose a 2.5D deep learning neural network (DLNN) to automatically classify thigh muscle into 11 classes and evaluate its classification accuracy over 2D and 3D DLNN when trained with lim...
arxiv.org
arxiv.org › abs › 2412.09441v2
Class-Incremental Learning (CIL) requires models to continually acquire knowledge of new classes without forgetting old ones. Although Pre-trained Models (PTMs) have shown excellent performance in CIL,...
www.reddit.com
reddit.com › r › StarW..._trained_force_user ›
Here I will seek to establish that Jar Jar Binks, far from being simply the bumbling idiot he portrays himself as, is in fact a highly skilled force user in terms of martial ability and mind control. ...
arxiv.org
arxiv.org › abs › 2511.01066v2
We present an ongoing initiative to provide open, very large, high-quality, and richly annotated textual datasets for almost 200 languages. At 30 trillion tokens, this is likely the largest generally ...
www.bing.com Bing
bing.com › ck › a?!&am...10cmFpbmVk&ntb=1
iCIBA's authoritative online dictionary provides the Chinese meaning of "trained", explanations of its usage, its pronunciation, synonyms, antonyms, example sentences, and other English-language services.
en.wikipedia.org Wikipedia
en.wikipedia.org › wiki › Training
tools, equipment, documents or materials that trainees will use when fully trained. On-the-job training has a general reputation as most effective for vocational
www.reddit.com Reddit
reddit.com › r › Publi..._hanoi_train_street ›
...
github.com GitHub
github.com › google-research › bert
TensorFlow code and pre-trained models for BERT (⭐ 39882)
github.com HackerNews
github.com › DGoettlich › history-llms
Points: 897 | Comments: 421 | Author: iamwil
arxiv.org arXiv
arxiv.org › abs › 2206.00832v2
Benchmarking the tradeoff between neural network accuracy and training time is computationally expensive. Here we show how a multiplicative cyclic learning rate schedule can be used to construct a tra...
www.bing.com Bing
bing.com › ck › a?!&am...cmFpbmVkKw&ntb=1
iCIBA's authoritative online dictionary provides the Chinese meaning of "Trained", explanations of its usage, its pronunciation, synonyms, antonyms, example sentences, and other English-language services.
en.wikipedia.org Wikipedia
en.wikipedia.org › wiki › ...-trained_transformer
generative pre-trained transformers to generate text, such as Gemini, DeepSeek and Claude. GPTs are primarily used to generate text, but can be trained to generate
www.reddit.com Reddit
reddit.com › r › fight...es_a_street_fighter ›
...
github.com GitHub
github.com › jakearchibald › trained-to-thrill
Trains! Yey! (⭐ 330)
www.latent.space HackerNews
latent.space › p › reza-shabani#details
Points: 891 | Comments: 220 | Author: swyx
arxiv.org arXiv
arxiv.org › abs › 2512.13687v1
The quality of the latent space in visual tokenizers (e.g., VAEs) is crucial for modern generative models. However, the standard reconstruction-based training paradigm produces a latent space that is ...
www.bing.com Bing
bing.com › ck › a?!&am...TQlQkElQkE&ntb=1
Kingsoft PowerWord is committed to providing users with efficient and accurate online translation services, supporting online translation among 177 languages including Chinese, English, Japanese, Korean, German, and French, and covering instant free AI translation, English translation, Russian...
en.wikipedia.org Wikipedia
en.wikipedia.org › wiki › Train
A train (from Old French trahiner, from Latin trahere, "to pull, to draw") is a series of connected vehicles that run along a railway track and transport
www.reddit.com Reddit
reddit.com › r › Whatc...a_firing_range_with ›
...
github.com GitHub
github.com › loujie0822 › Pre-trained-Models
A survey of pre-trained language models (⭐ 548)
arxiv.org arXiv
arxiv.org › abs › 2005.07202v3
Pre-training large-scale neural language models on raw texts has made a significant contribution to improving transfer learning in natural language processing (NLP). With the introduction of transform...
www.bing.com Bing
bing.com › ck › a?!&am...hpbWF0ZWx5&ntb=1
Kingsoft PowerWord is committed to providing users with efficient and accurate online translation services, supporting online translation among 177 languages including Chinese, English, Japanese, Korean, German, and French, and covering instant free AI translation, English translation, Russian...
en.wikipedia.org Wikipedia
en.wikipedia.org › wiki › Train_Dreams_%28film%29
Train Dreams is a 2025 American period drama film directed by Clint Bentley, who co-wrote the screenplay with Greg Kwedar, based on the 2011 novella by
github.com GitHub
github.com › neonbjb › tortoise-tts
A multi-voice TTS system trained with an emphasis on quality (⭐ 14817)
arxiv.org arXiv
arxiv.org › abs › 2109.04912v1
We present ReasonBert, a pre-training method that augments language models with the ability to reason over long-range relations and multiple, possibly hybrid contexts. Unlike existing pre-training met...
www.bing.com Bing
bing.com › ck › a?!&am...WZpY2FsbHk&ntb=1
Students are placed in small groups with counselors and trained seniors on campus; they have access to cultural and ethnic affinity (connection) groups, tutoring centers, and also have a summer orientation …
en.wikipedia.org Wikipedia
en.wikipedia.org › wiki › This_Train
"This Train", also known as "This Train Is Bound for Glory", is a traditional African-American gospel song first recorded in 1922. Although its origins
www.reddit.com Reddit
reddit.com › r › mildl...to_only_come_inside ›
She started singing to him as a joke instead of just calling him. Now he only responds to the song and ignores me otherwise. I’ll be outside while my neighbors BBQ and need to sing this stupid song....
github.com GitHub
github.com › tesseract-ocr › tessdata
Trained models with fast variant of the "best" LSTM models + legacy models (⭐ 7406)
arxiv.org arXiv
arxiv.org › abs › 1103.0540v1
Multifarious image enhancement algorithms have been used in different applications. Still, some algorithms or modules are imperfect for practical use. When the image enhancement modules have been fixe...
www.bing.com Bing
bing.com › ck › a?!&am...1lZGljaW5l&ntb=1
GPs are trained in general medicine but are not specialists in any particular subject. These doctors are trained in general medical knowledge rather than being experts in any specialized field. (Internet; English-English definition)
en.wikipedia.org Wikipedia
en.wikipedia.org › wiki › ChatGPT
developed by OpenAI. It was released in November 2022. It uses generative pre-trained transformers (GPTs), such as GPT-5.2, to generate text, speech, and images
www.reddit.com Reddit
reddit.com › r › iamat...h_someone_untrained ›
...
github.com GitHub
github.com › snakers4 › silero-vad
Silero VAD: pre-trained enterprise-grade Voice Activity Detector (⭐ 8348)
arxiv.org arXiv
arxiv.org › abs › 2004.06165v5
Large-scale pre-training methods of learning cross-modal representations on image-text pairs are becoming popular for vision-language tasks. While existing methods simply concatenate image region feat...
en.wikipedia.org Wikipedia
en.wikipedia.org › wiki › Trained_band
A nineteenth-century dictionary says, under "Train": train-band, i.e. train'd band, a band of trained men, Cowper, John Gilpin, st. I, and used by Dryden
www.reddit.com Reddit
reddit.com › r › fight...her_one_was_trained ›
...
github.com GitHub
github.com › AI4Finance-Foundation › FinGPT
FinGPT: Open-Source Financial Large Language Models! Revolutionize 🔥 We release the trained model on HuggingFace. (⭐ 18730)
arxiv.org arXiv
arxiv.org › abs › 2405.14365v1
Mathematical reasoning is an important capability of large language models~(LLMs) for real-world applications. To enhance this capability, existing work either collects large-scale math-related texts ...
en.wikipedia.org Wikipedia
en.wikipedia.org › wiki › A_Train
Look up Train or train in Wiktionary, the free dictionary. A Train may refer to: The A (New York City Subway service) A Division (New York City Subway)
www.reddit.com Reddit
reddit.com › r › ChatG...hine_lets_see_proof ›
Yeah, we're all in the death business now that OpenAI has succumbed to the corrupt Department of War.
Let's see proof of your cancellation boys and girls. ...
github.com GitHub
github.com › tesseract-ocr › tessdata_best
Best (most accurate) trained LSTM models. (⭐ 1508)
arxiv.org arXiv
arxiv.org › abs › 2302.07142v2
This letter proposes a semantic importance-aware communication (SIAC) scheme using pre-trained language models (e.g., ChatGPT, BERT, etc.). Specifically, we propose a cross-layer design with a pre-tra...
en.wikipedia.org Wikipedia
en.wikipedia.org › wiki › Train_Train
Train Train may refer to: Train+Train, a 2000 Japanese light novel series Train*Train, a manga by Eiki Eiki "Train, Train", a song by Blackfoot "Train
www.reddit.com Reddit
reddit.com › r › TikTo..._the_25yearold_they ›
...
github.com GitHub
github.com › openvinotoolkit › open_model_zoo
Pre-trained Deep Learning models and demos (high quality and extremely fast) (⭐ 4361)
arxiv.org arXiv
arxiv.org › abs › 1909.03564v2
This study compares the effectiveness and robustness of multi-class categorization of Amazon product data using transfer learning on pre-trained contextualized language models. Specifically, we fine-t...
en.wikipedia.org Wikipedia
en.wikipedia.org › wiki › The_Train
The Train may refer to: The Train (1964 film), an American war film starring Burt Lancaster and Paul Scofield The Train (1970 film), an Indian Hindi suspense
www.reddit.com Reddit
reddit.com › r › nextf..._recalls_his_boxing ›
...
github.com GitHub
github.com › zai-org › GLM-130B
GLM-130B: An Open Bilingual Pre-Trained Model (ICLR 2023) (⭐ 7669)
arxiv.org arXiv
arxiv.org › abs › 2211.05187v1
Transformers have become central to recent advances in computer vision. However, training a vision Transformer (ViT) model from scratch can be resource intensive and time consuming. In this paper, we ...
