
How can I load the FastSAM model in float16? #233

Open
dahwin opened this issue Jun 28, 2024 · 0 comments

dahwin commented Jun 28, 2024

```python
%cd /content/FastSAM
import torch
from PIL import Image
import matplotlib.pyplot as plt
from fastsam import FastSAM, FastSAMPrompt

# Set up parameters
model_path = "/content/FastSAM.pt"
img_path = "/content/grid.png"
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# Load model
model = FastSAM(model_path)

# Load and process image
input_image = Image.open(img_path).convert("RGB")

# Run inference
everything_results = model(
    input_image,
    device=device,
    retina_masks=True,
    imgsz=1024,
    conf=0.4,
    iou=0.9
)

# Process results
prompt_process = FastSAMPrompt(input_image, everything_results, device=device)
ann = prompt_process.everything_prompt()

# Plot results
result_image = prompt_process.plot(
    annotations=ann,
    withContours=True,
    better_quality=True,
    output_path='dahyun.png'
)
```
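No answer was posted in the thread. As a starting point, FastSAM wraps a `torch.nn.Module`, so the general PyTorch technique is to call `.half()` on the underlying module and cast inputs to `torch.float16`. The sketch below demonstrates this on a hypothetical stand-in network, not on FastSAM itself; whether the FastSAM wrapper exposes the inner module (e.g. as `model.model`) or accepts an Ultralytics-style `half=True` inference argument is an assumption to verify against the installed version.

```python
import torch
import torch.nn as nn

# Hypothetical stand-in for FastSAM's underlying network; .half() behaves
# the same on any torch.nn.Module (e.g. the module FastSAM wraps).
net = nn.Sequential(nn.Conv2d(3, 8, 3, padding=1), nn.ReLU())
net = net.half()  # cast all parameters and buffers to torch.float16

print(next(net.parameters()).dtype)  # torch.float16

# Inputs must match the model dtype. Half-precision inference is best run
# on GPU, since CPU support for float16 ops varies by PyTorch version.
if torch.cuda.is_available():
    net = net.cuda()
    x = torch.randn(1, 3, 64, 64, device="cuda", dtype=torch.float16)
    with torch.no_grad():
        y = net(x)
    print(y.dtype)  # torch.float16
```

Note that `.half()` converts weights in place on the module; saved checkpoints are unaffected, and you can reload in float32 at any time.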
