Moozonian

About 657 results
arxiv.org
arxiv.org › abs › 2509.04516v2
Artificially Fluent: Swahili AI Performance Benchmarks Between English-Trained and Natively-Trained Datasets
As large language models (LLMs) expand multilingual capabilities, questions remain about the equity of their performance across languages. While many communities stand to benefit from AI systems, the ...
arxiv.org
arxiv.org › abs › 2404.12407v1
TV100: A TV Series Dataset that Pre-Trained CLIP Has Not Seen
The era of pre-trained models has ushered in a wealth of new insights for the machine learning community. Among the myriad of questions that arise, one of paramount importance is: 'Do pre-trained mode...
arxiv.org
arxiv.org › abs › 2512.13552v1
PrahokBART: A Pre-trained Sequence-to-Sequence Model for Khmer Natural Language Generation
This work introduces PrahokBART, a compact pre-trained sequence-to-sequence model trained from scratch for Khmer using carefully curated Khmer and English corpora. We focus on improving the pre-...
arxiv.org
arxiv.org › abs › 1910.04760v4
A cost-effective method for improving and re-purposing large, pre-trained GANs by fine-tuning their class-embeddings
Large, pre-trained generative models have been increasingly popular and useful to both the research and wider communities. Specifically, BigGAN, a class-conditional Generative Adversarial Network tra...
research.google
research.google › blog › h...underwater-mysteries
How AI trained on birds is surfacing underwater mysteries
Discover how Google researchers repurposed AI models trained on bird songs to detect whale calls and track marine health. Explore the future of bioacoustic monitoring.
arxiv.org
arxiv.org › abs › 2602.08818v1
FlexMoRE: A Flexible Mixture of Rank-heterogeneous Experts for Efficient Federatedly-trained Large Language Models
Recent advances in mixture-of-experts architectures have shown that individual expert models can be trained federatedly, i.e., in isolation from other experts by using a common base model to facilita...
arxiv.org
arxiv.org › abs › 2312.09052v1
Applying Pre-Trained Deep-Learning Model on Wrist Angel Data -- An Analysis Plan
We aim to investigate if we can improve predictions of stress caused by OCD symptoms using pre-trained models, and present our statistical analysis plan in this paper. With the methods presented in th...
arxiv.org
arxiv.org › abs › 2203.09127v3
ERNIE-GeoL: A Geography-and-Language Pre-trained Model and its Applications in Baidu Maps
Pre-trained models (PTMs) have become a fundamental backbone for downstream tasks in natural language processing and computer vision. Despite initial gains that were obtained by applying generic PTMs ...
arxiv.org
arxiv.org › abs › 2508.10160v1
Pre-trained Transformer-models using chronic invasive electrophysiology for symptom decoding without patient-individual training
Neural decoding of pathological and physiological states can enable patient-individualized closed-loop neuromodulation therapy. Recent advances in pre-trained large-scale foundation models offer the p...
arxiv.org
arxiv.org › abs › 2307.04765v1
Performance Comparison of Pre-trained Models for Speech-to-Text in Turkish: Whisper-Small and Wav2Vec2-XLS-R-300M
In this study, the performances of the Whisper-Small and Wav2Vec2-XLS-R-300M models which are two pre-trained multilingual models for speech to text were examined for the Turkish language. Mozilla Com...
arxiv.org
arxiv.org › abs › 2209.04252v1
Talking Head from Speech Audio using a Pre-trained Image Generator
We propose a novel method for generating high-resolution videos of talking-heads from speech audio and a single 'identity' image. Our method is based on a convolutional neural network model that incor...
arxiv.org
arxiv.org › abs › 2308.02019v2
Baby Llama: knowledge distillation from an ensemble of teachers trained on a small dataset with no performance penalty
We present our submission to the BabyLM challenge, whose goal was to improve the sample efficiency of language models. We trained an ensemble consisting of a GPT-2 and small LLaMA models on the develo...
arxiv.org
arxiv.org › abs › 2211.00151v3
A Close Look into the Calibration of Pre-trained Language Models
Pre-trained language models (PLMs) may fail in giving reliable estimates of their predictive uncertainty. We take a close look into this problem, aiming to answer two questions: (1) Do PLMs learn to b...
arxiv.org
arxiv.org › abs › 2006.07322v5
Evaluation of Neural Architectures Trained with Square Loss vs Cross-Entropy in Classification Tasks
Modern neural architectures for classification tasks are trained using the cross-entropy loss, which is widely believed to be empirically superior to the square loss. In this work we provide evidence ...
arxiv.org
arxiv.org › abs › 2512.11146v1
A Quarter of US-Trained Scientists Eventually Leave. Is the US Giving Away Its Edge?
Using newly-assembled data from 1980 through 2024, we show that 25% of scientifically-active, US-trained STEM PhD graduates leave the US within 15 years of graduating. Leave rates are lower in the lif...
arxiv.org
arxiv.org › abs › 1911.09249v1
Semantic Segmentation of Thigh Muscle using 2.5D Deep Learning Network Trained with Limited Datasets
Purpose: We propose a 2.5D deep learning neural network (DLNN) to automatically classify thigh muscle into 11 classes and evaluate its classification accuracy over 2D and 3D DLNN when trained with lim...
arxiv.org
arxiv.org › abs › 2412.09441v2
MOS: Model Surgery for Pre-Trained Model-Based Class-Incremental Learning
Class-Incremental Learning (CIL) requires models to continually acquire knowledge of new classes without forgetting old ones. Although Pre-trained Models (PTMs) have shown excellent performance in CIL,...
www.reddit.com
reddit.com › r › StarW..._trained_force_user ›
[Theory] Jar Jar Binks was a trained Force user, knowing Sith collaborator, and will play a central role in The Force Awakens
Here I will seek to establish that Jar Jar Binks, far from being simply the bumbling idiot he portrays himself as, is in fact a highly skilled force user in terms of martial ability and mind control. ...
arxiv.org
arxiv.org › abs › 2511.01066v2
HPLT 3.0: Very Large-Scale Multilingual Resources for LLM and MT. Mono- and Bi-lingual Data, Multilingual Evaluation, and Pre-Trained Models
We present an ongoing initiative to provide open, very large, high-quality, and richly annotated textual datasets for almost 200 languages. At 30 trillion tokens, this is likely the largest generally ...
www.bing.com Bing
bing.com › ck › a?!&am...10cmFpbmVk&ntb=1
What does "trained" mean — translation, phonetic symbols, pronunciation, usage, and example sentences — iCIBA Onl…
iCIBA's authoritative online dictionary provides the Chinese meaning of "trained", explanations of its usage, its pronunciation, synonyms, antonyms, example sentences, and other English-language services.
en.wikipedia.org Wikipedia
en.wikipedia.org › wiki › Training
Training - Wikipedia
tools, equipment, documents or materials that trainees will use when fully trained. On-the-job training has a general reputation as most effective for vocational
www.reddit.com Reddit
reddit.com › r › Publi..._hanoi_train_street ›
Tourist gets hit by train at Hanoi train street
...
github.com GitHub
github.com › google-research › bert
google-research/bert
TensorFlow code and pre-trained models for BERT (⭐ 39882)
github.com HackerNews
github.com › DGoettlich › history-llms
History LLMs: Models trained exclusively on pre-1913 texts
Points: 897 | Comments: 421 | Author: iamwil
arxiv.org arXiv
arxiv.org › abs › 2206.00832v2
Fast Benchmarking of Accuracy vs. Training Time with Cyclic Learning Rates
Benchmarking the tradeoff between neural network accuracy and training time is computationally expensive. Here we show how a multiplicative cyclic learning rate schedule can be used to construct a tra...
www.bing.com Bing
bing.com › ck › a?!&am...cmFpbmVkKw&ntb=1
What does "Trained" mean — translation, phonetic symbols, pronunciation, usage, and example sentences — iCIBA …
iCIBA's authoritative online dictionary provides the Chinese meaning of "Trained", explanations of its usage, its pronunciation, synonyms, antonyms, example sentences, and other English-language services.
en.wikipedia.org Wikipedia
en.wikipedia.org › wiki › ...-trained_transformer
Generative pre-trained transformer - Wikipedia
generative pre-trained transformers to generate text, such as Gemini, DeepSeek and Claude. GPTs are primarily used to generate text, but can be trained to generate
www.reddit.com Reddit
reddit.com › r › fight...es_a_street_fighter ›
Trained fighter humiliates a street fighter
...
github.com GitHub
github.com › jakearchibald › trained-to-thrill
jakearchibald/trained-to-thrill
Trains! Yey! (⭐ 330)
www.latent.space HackerNews
latent.space › p › reza-shabani#details
Replit's new Code LLM: Open Source, 77% smaller than Codex, trained in 1 week
Points: 891 | Comments: 220 | Author: swyx
arxiv.org arXiv
arxiv.org › abs › 2512.13687v1
Towards Scalable Pre-training of Visual Tokenizers for Generation
The quality of the latent space in visual tokenizers (e.g., VAEs) is crucial for modern generative models. However, the standard reconstruction-based training paradigm produces a latent space that is ...
www.bing.com Bing
bing.com › ck › a?!&am...TQlQkElQkE&ntb=1
What does "盲人" (blind person) mean — translation, phonetic symbols, pronunciation, usage, and example sentences — iCIBA Online Dictionary
Kingsoft iCIBA is committed to providing efficient, accurate online translation services, supporting online translation among 177 languages including Chinese, English, Japanese, Korean, German, and French, covering instant free AI translation, English translation, Ru...
en.wikipedia.org Wikipedia
en.wikipedia.org › wiki › Train
Train - Wikipedia
A train (from Old French trahiner, from Latin trahere, "to pull, to draw") is a series of connected vehicles that run along a railway track and transport
www.reddit.com Reddit
reddit.com › r › Whatc...a_firing_range_with ›
Pointing a gun at a friend in a firing range, with trained staff, just for a selfie. WCGW?
...
github.com GitHub
github.com › loujie0822 › Pre-trained-Models
loujie0822/Pre-trained-Models
A survey of pre-trained language models (预训练语言模型综述) (⭐ 548)
arxiv.org arXiv
arxiv.org › abs › 2005.07202v3
Pre-training technique to localize medical BERT and enhance biomedical BERT
Pre-training large-scale neural language models on raw texts has made a significant contribution to improving transfer learning in natural language processing (NLP). With the introduction of transform...
www.bing.com Bing
bing.com › ck › a?!&am...hpbWF0ZWx5&ntb=1
What does "approximately" mean — translation, phonetic symbols, pronunciation, usage, and exam…
Kingsoft iCIBA is committed to providing efficient, accurate online translation services, supporting online translation among 177 languages including Chinese, English, Japanese, Korean, German, and French, covering instant free AI translation, English translation, Ru...
en.wikipedia.org Wikipedia
en.wikipedia.org › wiki › Train_Dreams_%28film%29
Train Dreams (film) - Wikipedia
Train Dreams is a 2025 American period drama film directed by Clint Bentley, who co-wrote the screenplay with Greg Kwedar, based on the 2011 novella by
www.reddit.com Reddit
reddit.com › r › Unexp...nts › rrgf0q › training ›
Training
...
github.com GitHub
github.com › neonbjb › tortoise-tts
neonbjb/tortoise-tts
A multi-voice TTS system trained with an emphasis on quality (⭐ 14817)
arxiv.org arXiv
arxiv.org › abs › 2109.04912v1
ReasonBERT: Pre-trained to Reason with Distant Supervision
We present ReasonBert, a pre-training method that augments language models with the ability to reason over long-range relations and multiple, possibly hybrid contexts. Unlike existing pre-training met...
www.bing.com Bing
bing.com › ck › a?!&am...WZpY2FsbHk&ntb=1
What does "specifically" mean — translation, phonetic symbols, pronunciation, usage, and example sentences — iCI…
Students are placed in small groups with counselors trained seniors on campus; they have access to cultural and ethnic affinity (connection) groups, tutoring centers and also have a summer orientation …
en.wikipedia.org Wikipedia
en.wikipedia.org › wiki › This_Train
This Train - Wikipedia
"This Train", also known as "This Train Is Bound for Glory", is a traditional African-American gospel song first recorded in 1922. Although its origins
www.reddit.com Reddit
reddit.com › r › mildl...to_only_come_inside ›
Wife accidentally trained dog to only come inside if you sing to him.
She started singing to him as a joke instead of just calling him. Now he only responds to the song and ignores me otherwise. I’ll be outside while my neighbors BBQ and need to sing this stupid song....
github.com GitHub
github.com › tesseract-ocr › tessdata
tesseract-ocr/tessdata
Trained models with fast variant of the "best" LSTM models + legacy models (⭐ 7406)
arxiv.org arXiv
arxiv.org › abs › 1103.0540v1
An Algorithm for Repairing Low-Quality Video Enhancement Techniques Based on Trained Filter
Multifarious image enhancement algorithms have been used in different applications. Still, some algorithms or modules are imperfect for practical use. When the image enhancement modules have been fixe...
www.bing.com Bing
bing.com › ck › a?!&am...1lZGljaW5l&ntb=1
What does "general medicine" mean — translation, phonetic symbols, pronunciation, and us…
GPs are trained in general medicine but are not specialists in any particular subject. (These doctors are trained in general medical knowledge rather than being experts in a specialized field.) Internet · English-English definitions
en.wikipedia.org Wikipedia
en.wikipedia.org › wiki › ChatGPT
ChatGPT - Wikipedia
developed by OpenAI. It was released in November 2022. It uses generative pre-trained transformers (GPTs), such as GPT-5.2, to generate text, speech, and images
www.reddit.com Reddit
reddit.com › r › iamat...h_someone_untrained ›
Trained MMA fighter rolls with someone untrained, ignores his tap outs, chokes him unconscious and then mocks him
...
github.com GitHub
github.com › snakers4 › silero-vad
snakers4/silero-vad
Silero VAD: pre-trained enterprise-grade Voice Activity Detector (⭐ 8348)
arxiv.org arXiv
arxiv.org › abs › 2004.06165v5
Oscar: Object-Semantics Aligned Pre-training for Vision-Language Tasks
Large-scale pre-training methods of learning cross-modal representations on image-text pairs are becoming popular for vision-language tasks. While existing methods simply concatenate image region feat...
en.wikipedia.org Wikipedia
en.wikipedia.org › wiki › Trained_band
Trained band - Wikipedia
A nineteenth-century dictionary says, under "Train": train-band, i.e. train'd band, a band of trained men, Cowper, John Gilpin, st. I, and used by Dryden
www.reddit.com Reddit
reddit.com › r › fight...her_one_was_trained ›
He didn’t know the other one was trained.
...
github.com GitHub
github.com › AI4Finance-Foundation › FinGPT
AI4Finance-Foundation/FinGPT
FinGPT: Open-Source Financial Large Language Models! Revolutionize 🔥 We release the trained model on HuggingFace. (⭐ 18730)
arxiv.org arXiv
arxiv.org › abs › 2405.14365v1
JiuZhang3.0: Efficiently Improving Mathematical Reasoning by Training Small Data Synthesis Models
Mathematical reasoning is an important capability of large language models~(LLMs) for real-world applications. To enhance this capability, existing work either collects large-scale math-related texts ...
en.wikipedia.org Wikipedia
en.wikipedia.org › wiki › A_Train
A Train - Wikipedia
Look up Train or train in Wiktionary, the free dictionary. A Train may refer to: The A (New York City Subway service) A Division (New York City Subway)
www.reddit.com Reddit
reddit.com › r › ChatG...hine_lets_see_proof ›
You're now training a war machine. Let's see proof of cancellation.
Yeah, we're all in the death business now that OpenAI has succumbed to the corrupt Department of War. Let's see proof of your cancellation boys and girls. ...
github.com GitHub
github.com › tesseract-ocr › tessdata_best
tesseract-ocr/tessdata_best
Best (most accurate) trained LSTM models. (⭐ 1508)
arxiv.org arXiv
arxiv.org › abs › 2302.07142v2
Semantic Importance-Aware Communications Using Pre-trained Language Models
This letter proposes a semantic importance-aware communication (SIAC) scheme using pre-trained language models (e.g., ChatGPT, BERT, etc.). Specifically, we propose a cross-layer design with a pre-tra...
en.wikipedia.org Wikipedia
en.wikipedia.org › wiki › Train_Train
Train Train - Wikipedia
Train Train may refer to: Train+Train, a 2000 Japanese light novel series Train*Train, a manga by Eiki Eiki "Train, Train", a song by Blackfoot "Train
www.reddit.com Reddit
reddit.com › r › TikTo..._the_25yearold_they ›
They asked her to train the 25-year-old they promoted over her.
...
github.com GitHub
github.com › openvinotoolkit › open_model_zoo
openvinotoolkit/open_model_zoo
Pre-trained Deep Learning models and demos (high quality and extremely fast) (⭐ 4361)
arxiv.org arXiv
arxiv.org › abs › 1909.03564v2
Transfer Learning Robustness in Multi-Class Categorization by Fine-Tuning Pre-Trained Contextualized Language Models
This study compares the effectiveness and robustness of multi-class categorization of Amazon product data using transfer learning on pre-trained contextualized language models. Specifically, we fine-t...
en.wikipedia.org Wikipedia
en.wikipedia.org › wiki › The_Train
The Train - Wikipedia
The Train may refer to: The Train (1964 film), an American war film starring Burt Lancaster and Paul Scofield The Train (1970 film), an Indian Hindi suspense
www.reddit.com Reddit
reddit.com › r › nextf..._recalls_his_boxing ›
Waiter being attacked suddenly recalls his boxing training
...
github.com GitHub
github.com › zai-org › GLM-130B
zai-org/GLM-130B
GLM-130B: An Open Bilingual Pre-Trained Model (ICLR 2023) (⭐ 7669)
arxiv.org arXiv
arxiv.org › abs › 2211.05187v1
Training a Vision Transformer from scratch in less than 24 hours with 1 GPU
Transformers have become central to recent advances in computer vision. However, training a vision Transformer (ViT) model from scratch can be resource intensive and time consuming. In this paper, we ...