Commit 48c73c4

wip

1 parent 3e28689

File tree

1 file changed: +10 -10 lines changed


_posts/2024-12-20-You-can-do-AI-with-cpp.markdown (10 additions, 10 deletions)

@@ -30,10 +30,9 @@ most representative AI libraries available in Conan Center Index.
 
 ### An Overview of Some AI and ML Libraries Available in Conan Center
 
-Below are some notable libraries you can easily integrate with your C++ projects through
-Conan Center. These libraries range from running large language models locally to
-optimizing model inference on edge devices or using specialized toolkits for tasks like
-computer vision and numerical optimization.
+Below are some notable libraries available in Conan Center Index. These libraries range
+from running large language models locally to optimizing model inference on edge devices
+or using specialized toolkits for tasks like computer vision and numerical optimization.
 
 #### LLaMA.cpp
 
@@ -45,12 +44,13 @@ models like [LLaMA 3](https://huggingface.co/models?search=llama),
 as well as multimodal models like [LLaVA](https://github.com/haotian-liu/LLaVA).
 
 One of the most interesting aspects of this library is that it includes a collection of
-CLI tools as examples, making it easy to run your own LLMs straight out of the box. To
-install the library with Conan, ensure that you enable building the examples and activate
-the network options (which require `libcurl`). Then, use a [Conan
-deployer](https://docs.conan.io/2/reference/extensions/deployers.html) to move the
-installed files from the Conan cache to the user space. To accomplish this, simply run the
-following command:
+CLI tools as examples, making it easy to run your own LLMs straight out of the box.
+
+Let's try one of those tools. First, install the library with Conan and ensure that you
+enable building the examples and activate the network options (which require `libcurl`).
+Then, use a [Conan deployer](https://docs.conan.io/2/reference/extensions/deployers.html)
+to move the installed files from the Conan cache to the user space. To accomplish this,
+simply run the following command:
 
 ```shell
 # Install llama-cpp using Conan and deploy to the local folder