Are you interested in Data Science? Looking for a reliable source of information on AI, Big Data, Machine Learning, Statistics, and general Math? Look no further than the Data Science channel by ODS.ai 🦜 on Telegram! This channel is the first of its kind, dedicated to providing subscribers with all the latest technical and popular news and updates in the world of Data Science. Whether you are a seasoned professional or just starting out in the field, this channel has something for everyone. From tutorials and guides to real-world applications and case studies, you will find a wealth of knowledge at your fingertips. The editors of this channel are experts in the field and are always available to answer your questions and provide guidance. To connect with them, simply reach out to @haarrp on Telegram. Don't miss out on the opportunity to stay informed and up-to-date on all things Data Science - join the Data Science channel by ODS.ai 🦜 today!
# Clone repo
git clone https://github.com/Johanan528/DepthLab.git
cd DepthLab
# Create conda env
conda env create -f environment.yaml
conda activate DepthLab
# Run inference
cd scripts
bash infer.sh
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Place the model and newly created tensors on the GPU by default
torch.set_default_device("cuda")

# Load the INTELLECT-1 weights and their matching tokenizer
model = AutoModelForCausalLM.from_pretrained("PrimeIntellect/INTELLECT-1")
tokenizer = AutoTokenizer.from_pretrained("PrimeIntellect/INTELLECT-1")

input_text = "%prompt%"
input_ids = tokenizer.encode(input_text, return_tensors="pt")

# Generate up to 50 tokens and decode the single returned sequence
output_ids = model.generate(input_ids, max_length=50, num_return_sequences=1)
output_text = tokenizer.decode(output_ids[0], skip_special_tokens=True)
print(output_text)
# Create new conda env and activate it
conda create -n myenv -c conda-forge -c legate cupynumeric
conda activate myenv
# Test via example from repo
$ legate examples/black_scholes.py
Running black scholes on 10K options...
Elapsed Time: 129.017 ms
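cupynumeric is positioned as a drop-in replacement for NumPy, so ordinary array code runs under the legate launcher with only the import changed. A minimal sketch of that idea, written here against plain numpy (assuming API parity for these calls; swap the import for cupynumeric when running under legate):

```python
# Drop-in usage sketch: under legate you would write
# `import cupynumeric as np` instead of the line below
# (assumption: these calls exist unchanged in cupynumeric).
import numpy as np

x = np.linspace(0.0, 1.0, 5)   # five evenly spaced samples in [0, 1]
y = np.sqrt(x) + 1.0           # elementwise math, same syntax either way
print(y.sum())
```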
# official online demo
DEMO_PORT=15432 \
python app/app_sana.py \
--config=configs/sana_config/1024ms/Sana_1600M_img1024.yaml \
--model_path=hf://Efficient-Large-Model/Sana_1600M_1024px/checkpoints/Sana_1600M_1024px.pth
--aggressive_offload, but generation will run very, very, very slowly.

timestep to start inserting ID
This parameter controls at which step the ID (the face from the input image) is inserted into the DiT (a value of 0 inserts the ID from the first step). The smaller the value, the more the result resembles the source portrait. The recommended value for photorealism is 4.

true CFG scale
A parameter that modulates the CFG value. The original CFG process of the PuLID method, which required twice as many inference steps, has been converted into a control scale that mimics the true CFG process with half the inference steps.

# clone PuLID repo
git clone https://github.com/ToTheBeginning/PuLID.git
cd PuLID
# create conda env
conda create --name pulid python=3.10
# activate env
conda activate pulid
# Install dependent packages
# 1. For SDXL or Flux-bf16, install the following
pip install -r requirements.txt
# 2. For Flux-fp8, install this
pip install -r requirements_fp8.txt
# Run Gradio UI
python app.py
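The true CFG scale parameter relates to classifier-free guidance, where the conditional prediction is pushed away from the unconditional one by a guidance scale. A minimal pure-Python sketch of that mixing step (illustrative only; the function name and values are hypothetical, not PuLID's API):

```python
def cfg_mix(uncond, cond, scale):
    """Classifier-free guidance mix: uncond + scale * (cond - uncond).

    scale = 1.0 returns the conditional prediction unchanged;
    larger scales amplify the conditioning signal.
    """
    return [u + scale * (c - u) for u, c in zip(uncond, cond)]

# Hypothetical per-element model outputs, for illustration only
uncond = [0.0, 1.0]
cond = [1.0, 3.0]
print(cfg_mix(uncond, cond, 2.0))  # scale 2 doubles the cond-uncond gap
```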