Moozonian

About 208 results
arxiv.org
arxiv.org › abs › 2404.12407v1
TV100: A TV Series Dataset that Pre-Trained CLIP Has Not Seen
The era of pre-trained models has ushered in a wealth of new insights for the machine learning community. Among the myriad of questions that arise, one of paramount importance is: 'Do pre-trained mode...
research.google
research.google › blog › h...underwater-mysteries
How AI trained on birds is surfacing underwater mysteries
Discover how Google researchers repurposed AI models trained on bird songs to detect whale calls and track marine health. Explore the future of bioacoustic monitoring.
arxiv.org
arxiv.org › abs › 2602.08818v1
FlexMoRE: A Flexible Mixture of Rank-heterogeneous Experts for Efficient Federatedly-trained Large Language Models
Recent advances in mixture-of-experts architectures have shown that individual expert models can be trained federatedly, i.e., in isolation from other experts, by using a common base model to facilita...
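The snippet only names the mechanism, but the idea of rank-heterogeneous experts over a shared frozen base can be sketched: each expert is a LoRA-style low-rank update with its own rank, trainable independently of the others against the common base. The class names, ranks, and gating below are illustrative assumptions, not FlexMoRE's actual design.

```python
import torch
import torch.nn as nn

class LowRankExpert(nn.Module):
    """A LoRA-style expert: a low-rank update over a frozen base weight.
    Each expert may use a different rank, hence 'rank-heterogeneous'."""
    def __init__(self, d_model: int, rank: int):
        super().__init__()
        self.down = nn.Linear(d_model, rank, bias=False)  # d_model -> rank
        self.up = nn.Linear(rank, d_model, bias=False)    # rank -> d_model
        nn.init.zeros_(self.up.weight)  # start as a no-op update

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.up(self.down(x))

class MixtureOfRankHeterogeneousExperts(nn.Module):
    """Frozen shared base layer plus a gated sum of low-rank experts.
    Each expert touches only its own parameters, so in principle it can
    be trained in isolation (e.g., federatedly) against the same base."""
    def __init__(self, d_model: int, ranks: list[int]):
        super().__init__()
        self.base = nn.Linear(d_model, d_model)
        self.base.weight.requires_grad_(False)  # common base stays frozen
        self.base.bias.requires_grad_(False)
        self.experts = nn.ModuleList(LowRankExpert(d_model, r) for r in ranks)
        self.gate = nn.Linear(d_model, len(ranks))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        weights = torch.softmax(self.gate(x), dim=-1)                 # (..., n_experts)
        updates = torch.stack([e(x) for e in self.experts], dim=-1)  # (..., d, n_experts)
        return self.base(x) + (updates * weights.unsqueeze(-2)).sum(dim=-1)

mixture = MixtureOfRankHeterogeneousExperts(d_model=64, ranks=[4, 8, 16])
y = mixture(torch.randn(2, 10, 64))  # (batch, seq, d_model)
```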
arxiv.org
arxiv.org › abs › 2312.09052v1
Applying Pre-Trained Deep-Learning Model on Wrist Angel Data -- An Analysis Plan
We aim to investigate if we can improve predictions of stress caused by OCD symptoms using pre-trained models, and present our statistical analysis plan in this paper. With the methods presented in th...
arxiv.org
arxiv.org › abs › 2203.09127v3
ERNIE-GeoL: A Geography-and-Language Pre-trained Model and its Applications in Baidu Maps
Pre-trained models (PTMs) have become a fundamental backbone for downstream tasks in natural language processing and computer vision. Despite initial gains that were obtained by applying generic PTMs ...
arxiv.org
arxiv.org › abs › 2508.10160v1
Pre-trained Transformer-models using chronic invasive electrophysiology for symptom decoding without patient-individual training
Neural decoding of pathological and physiological states can enable patient-individualized closed-loop neuromodulation therapy. Recent advances in pre-trained large-scale foundation models offer the p...
arxiv.org
arxiv.org › abs › 2307.04765v1
Performance Comparison of Pre-trained Models for Speech-to-Text in Turkish: Whisper-Small and Wav2Vec2-XLS-R-300M
In this study, the performances of the Whisper-Small and Wav2Vec2-XLS-R-300M models, two pre-trained multilingual speech-to-text models, were examined for the Turkish language. Mozilla Com...
www.reddit.com
reddit.com › r › StarW..._trained_force_user ›
[Theory] Jar Jar Binks was a trained Force user, knowing Sith collaborator, and will play a central role in The Force Awakens
Here I will seek to establish that Jar Jar Binks, far from being simply the bumbling idiot he portrays himself as, is in fact a highly skilled Force user in terms of martial ability and mind control. ...
arxiv.org
arxiv.org › abs › 2511.01066v2
HPLT 3.0: Very Large-Scale Multilingual Resources for LLM and MT. Mono- and Bi-lingual Data, Multilingual Evaluation, and Pre-Trained Models
We present an ongoing initiative to provide open, very large, high-quality, and richly annotated textual datasets for almost 200 languages. At 30 trillion tokens, this is likely the largest generally ...
en.wikipedia.org
en.wikipedia.org › wiki › ...-trained_transformer
Generative pre-trained transformer - Wikipedia
(November 7, 2019). "OpenAI has published the text-generating AI it said was too dangerous to share". The Verge. Archived from the original on June 11, 2020
arxiv.org
arxiv.org › abs › 1906.08473v1
Data Cleansing for Models Trained with SGD
Data cleansing is a typical approach used to improve the accuracy of machine learning models, which, however, requires extensive domain knowledge to identify the influential instances that affect the ...
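The abstract gestures at scoring training instances by their influence on a model trained with SGD. As a crude first-order proxy (an assumption here, not the paper's actual SGD-based estimator), each training example can be scored by the dot product of its loss gradient with a validation-loss gradient; strongly negative scores flag harmful points.

```python
import torch
import torch.nn as nn

def influence_scores(model: nn.Module, loss_fn,
                     train_xs, train_ys, val_x, val_y) -> torch.Tensor:
    """Score each training example by the dot product of its loss gradient
    with the validation-loss gradient. Under a gradient-descent step, a
    negative score means the example pushes validation loss up, making it
    a candidate for cleansing. Generic illustration only."""
    params = [p for p in model.parameters() if p.requires_grad]

    val_loss = loss_fn(model(val_x), val_y)
    val_grad = torch.cat([g.flatten()
                          for g in torch.autograd.grad(val_loss, params)])

    scores = []
    for x, y in zip(train_xs, train_ys):
        loss = loss_fn(model(x.unsqueeze(0)), y.unsqueeze(0))
        grad = torch.cat([g.flatten()
                          for g in torch.autograd.grad(loss, params)])
        scores.append(torch.dot(grad, val_grad))
    return torch.stack(scores)

# Toy usage: a linear model on random data.
model = nn.Linear(5, 1)
xs, ys = torch.randn(20, 5), torch.randn(20, 1)
scores = influence_scores(model, nn.MSELoss(), xs, ys,
                          torch.randn(8, 5), torch.randn(8, 1))
```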
www.washingtonpost.com
washingtonpost.com...5d47e90ca_story.html
Mali coup leader was trained by US military
Points: 1 | Comments: 0 | Author: lilbaine
www.reddit.com
reddit.com › r › Local...r_v4_novel_on_log_n ›
Wave Field Transformer V4 — Novel O(n log n) attention architecture, 825M model trained from scratch on 1.33B tokens. Weights on HuggingFace.
Hey everyone, I've been building a new transformer architecture from scratch called Wave Field Transformer. Instead of standard O(n²) dot-product attention, it uses FFT-based wave interferenc...
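The post's description is cut off, but the stated core idea, replacing O(n²) dot-product attention with FFT-based mixing that costs O(n log n) in sequence length, can be sketched generically. Everything below (the learned per-frequency filter, names, shapes) is an assumption in the spirit of the description, not the actual Wave Field Transformer code.

```python
import torch
import torch.nn as nn

class FFTWaveMixing(nn.Module):
    """Token mixing in the frequency domain, O(n log n) in sequence length.
    A sketch of FFT-based mixing; the real Wave Field Transformer's
    interference scheme may differ."""
    def __init__(self, d_model: int, max_len: int):
        super().__init__()
        # Learned complex filter per rFFT frequency bin and channel,
        # a crude stand-in for 'wave interference'.
        n_bins = max_len // 2 + 1
        self.filter = nn.Parameter(torch.randn(n_bins, d_model, 2) * 0.02)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq, d_model)
        n = x.shape[1]
        spec = torch.fft.rfft(x, dim=1)                    # (batch, n//2+1, d)
        filt = torch.view_as_complex(self.filter[:spec.shape[1]])
        spec = spec * filt                                 # elementwise interference
        return torch.fft.irfft(spec, n=n, dim=1)           # back to token space

mix = FFTWaveMixing(d_model=64, max_len=128)
out = mix(torch.randn(2, 128, 64))  # same shape as input
```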
nwn.blogs.com
nwn.blogs.com › nwn › 2019...h-robert-thomas.html
Music Inspired by “Gödel, Escher, Bach” Created by an AI Trained on Bach
Points: 8 | Comments: 1 | Author: Kroeler
twitter.com
twitter.com › maximelabonne › status › 1762130357355184374
Is Mistral Large Trained on OpenAI Synthetic Data?
Points: 2 | Comments: 0 | Author: varunvummadi
arxiv.org
arxiv.org › abs › 2501.08333v3
DAViD: Modeling Dynamic Affordance of 3D Objects Using Pre-trained Video Diffusion Models
Modeling how humans interact with objects is crucial for AI to effectively assist or mimic human behaviors. Existing studies for learning such ability primarily focus on static human-object interactio...
www.bbc.com
bbc.com › news › world-europe-58552651
Cows toilet trained to reduce greenhouse gas emissions
Researchers say toilet training cows could lead to a large reduction in ammonia emissions.
arxiv.org
arxiv.org › abs › 2106.04935v1
Neural Supervised Domain Adaptation by Augmenting Pre-trained Models with Random Units
Neural Transfer Learning (TL) is becoming ubiquitous in Natural Language Processing (NLP), thanks to its high performance on many tasks, especially in low-resourced scenarios. Notably, TL is widely us...
arxiv.org
arxiv.org › abs › 2309.07117v3
PILOT: A Pre-Trained Model-Based Continual Learning Toolbox
While traditional machine learning can effectively tackle a wide range of problems, it primarily operates within a closed-world setting, which presents limitations when dealing with streaming data. As...