Sunday, July 13

Tag: Reddit

AI image generator, free and paid versions
News Feed, Reddit

AI image generator, free and paid versions

Hello, good evening. I would like to know which is the best AI for generating anime images, or images of any kind in general. I want to create my own scenes and characters in action (for example, a woman holding a big rubber mallet, about to smash an alarm clock as she is about to wake up), or to cross images and generate a fusion of them: for example, this girl (Kouko), in a green dress, holding Bugs Bunny's wooden mallet and smashing the clock, with the anime's main character alongside her: https://preview.redd.it/dy5mu852yp2e1.jpg?width=600&format=pjpg&auto=webp&s=3e3abb9298df28ed4af2e328071791c3d6dd7504
Comparing Precision Knowledge Editing with existing machine unlearning methods
News Feed, Reddit

Comparing Precision Knowledge Editing with existing machine unlearning methods

I've been working on a project called PKE (Precision Knowledge Editing), an open-source method to improve the safety of LLMs by reducing toxic content generation without impacting their general performance. It works by identifying "toxic hotspots" in the model using neuron weight tracking and activation pathway tracing, then modifying them through a custom loss function. There are plenty of existing machine unlearning techniques that can make LLMs safer right now, such as:

- Exact Unlearning: retraining the model from scratch after removing the undesired data. While it ensures complete removal of the data's influence, it is computationally expensive and time-consuming, especially for large models.
- Approximate Unlearning:
  - Fine-Tuning: adjusting the model using the remaining data t...
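To make the "toxic hotspot" idea concrete, here is a minimal, self-contained sketch of activation-based hotspot identification. The activations are synthetic stand-ins (in practice they would come from forward hooks on a real LLM), and the names `find_hotspots`, `toxic_acts`, and `neutral_acts` are hypothetical, not part of the PKE codebase:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for a layer's recorded activations:
# rows = prompts, columns = neurons.
n_neurons = 512
toxic_acts = rng.normal(0.0, 1.0, size=(100, n_neurons))
neutral_acts = rng.normal(0.0, 1.0, size=(100, n_neurons))

# Plant a few "toxic hotspots": neurons that fire much harder on toxic prompts.
planted = [7, 42, 300]
toxic_acts[:, planted] += 3.0

def find_hotspots(toxic, neutral, top_k=3):
    """Rank neurons by the gap in mean absolute activation between toxic
    and neutral prompts -- a crude proxy for activation pathway tracing."""
    gap = np.abs(toxic).mean(axis=0) - np.abs(neutral).mean(axis=0)
    return np.argsort(gap)[::-1][:top_k]

print(sorted(find_hotspots(toxic_acts, neutral_acts)))  # recovers the planted hotspots
```

In a real pipeline, the ranked neurons would then be targeted by the custom loss rather than simply reported.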
ADOPT: A Modified Adam Optimizer with Guaranteed Convergence for Any Beta-2 Value
News Feed, Reddit

ADOPT: A Modified Adam Optimizer with Guaranteed Convergence for Any Beta-2 Value

A new modification to Adam called ADOPT enables optimal convergence rates regardless of the β₂ parameter choice. The key insight is adding a simple term to Adam's update rule that compensates for potential convergence issues when β₂ is set suboptimally.

Technical details:
- ADOPT modifies Adam's update rule by introducing an additional term proportional to (1-β₂)
- Theoretical analysis proves an O(1/√T) convergence rate for any β₂ ∈ (0,1)
- Works for both convex and non-convex optimization
- Maintains Adam's practical benefits while improving theoretical guarantees
- Requires no additional hyperparameter tuning

Key results:
- Matches the optimal convergence rate of SGD for smooth non-convex optimization
- Empirically performs similarly to or better than Adam across tested scenarios
- Provides more ...
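The update rule change can be sketched in a few lines. This is a simplified, hedged reading of ADOPT's published recipe, where the current gradient is normalized by the *previous* second-moment estimate before the momentum update (so β₂ no longer couples the current gradient to its own normalizer); `adopt_step` and its defaults are illustrative, not the authors' reference implementation:

```python
import numpy as np

def adopt_step(theta, grad, m, v, lr=1e-3, beta1=0.9, beta2=0.9999, eps=1e-6):
    """One ADOPT-style update (sketch). Unlike Adam, the gradient is
    normalized by the previous second-moment estimate v before momentum
    is applied, which is what decouples convergence from beta2."""
    normed = grad / np.sqrt(v + eps)        # normalize by *previous* v
    m = beta1 * m + (1 - beta1) * normed    # momentum on normalized gradient
    theta = theta - lr * m                  # parameter step
    v = beta2 * v + (1 - beta2) * grad**2   # second moment updated last
    return theta, m, v

# Toy run: minimize f(x) = x^2 starting from x = 5.0
theta = np.array([5.0])
m = np.zeros_like(theta)
v = np.ones_like(theta)  # the paper seeds v from the first gradient; 1.0 for simplicity
for _ in range(2000):
    grad = 2 * theta
    theta, m, v = adopt_step(theta, grad, m, v, lr=0.05)
print(theta)  # approaches 0
```

Note the ordering: normalize, apply momentum, step, then update v. Swapping the last step back to Adam's ordering reintroduces the β₂-dependence the method is designed to remove.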
The AI Report