This LoRA is trained on screenshots of X-Men: Evolution (2000-2003) taken from the website fancaps.net
v1.0 BETA - (Only Logan) Logan without the Wolverine suits. The model can sometimes create black borders; this will be fixed in the full version. LoRA is trained on 230 screenshots
v1.0 - No more black borders, slightly more sharpness, and fewer JPEG artifacts. Includes Wolverine's first and second suits. LoRA is trained on 480 screenshots
v1.1 - The main difference from v1.0 is that I split the data and created several LoRAs; the previous version was an all-in-one, which in hindsight was not a great idea. I edited the captions slightly and made almost no changes to the images.
Logan - (Only Logan) Logan without the Wolverine suits
Logan Lite - (Only Logan, Lite) Logan in the dark blue t-shirt and brown jacket only, without the cowboy hat (the dataset was reduced by almost half)
Suit 1 - (First Suit) Wolverine in the first suit only; this LoRA was not trained on images of Wolverine without a mask!
Suit 1+U - (First Suit + Unmasked) Wolverine in the first suit, plus images of Wolverine without a mask
Suit 1+L - (First Suit + Logan) Wolverine in the first suit, plus images of Wolverine without a mask and images of Logan, to improve generation of unmasked Wolverine
Suit 2 - (Second Suit) Wolverine in the second suit only, without images of Logan
Previews were generated without ADetailer or Hires. fix
Recommended LoRA Strength (Weight): 0.8-0.9 (0.7-0.8 for the BETA)
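As a usage note, in AUTOMATIC1111-style web UIs a LoRA is applied with a `<lora:name:weight>` tag in the prompt. A minimal sketch of building such a tag (the file name `xmen_evo_logan` is a hypothetical placeholder, not the actual model file name):

```python
def lora_tag(name: str, weight: float) -> str:
    # Build an A1111-style LoRA activation tag for the prompt.
    return f"<lora:{name}:{weight}>"

# Recommended strength range from this card is 0.8-0.9:
print(lora_tag("xmen_evo_logan", 0.85))  # <lora:xmen_evo_logan:0.85>
```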
Repeats: 1
Epoch: 15
Image Count: 158
Network Rank (Dimension): 16
Network Alpha: 8
Batch size: 1
Clip skip: 1
Training steps: 2370
Resolution: 1024x768
Train Resolution: 1024,1024
Model: noobaiXLNAIXL_epsilonPred10Version
LR Scheduler: cosine_with_restarts
Optimizer: AdamW8bit
Learning rate: 0.0005
Text Encoder learning rate: 0.00005
Unet learning rate: 0.0005
Captioning: WD14 Captioning (SmilingWolf/wd-v1-4-moat-tagger-v2) + Edited
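The listed step count is consistent with the other parameters; a quick sanity check, assuming the kohya-ss-style formula steps = images × repeats × epochs / batch size:

```python
# Sanity-check the listed "Training steps" against the other parameters.
# Assumed formula (kohya-ss-style trainers): steps = images * repeats * epochs / batch.
image_count = 158
repeats = 1
epochs = 15
batch_size = 1

total_steps = image_count * repeats * epochs // batch_size
print(total_steps)  # 2370, matching the listed value
```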