
Unboxing Ragna: Getting hands on and making it work with Amazon Bedrock


I'm always on the lookout for interesting new projects to take a look at, and this week I came across Ragna, an open source Retrieval Augmented Generation (RAG) orchestration framework. It's a new project with a dedicated and active community, so I wanted to find out more about it.

In this post I'll share my experience of getting hands on with Ragna, and then also share how I was able to integrate it with Amazon Bedrock's foundation models, specifically Anthropic Claude. This is only a quick post, just meant to show you how you can get started yourself, but it includes everything you need to get up and running.

Prerequisites

If you want to follow along, then you will need access to an AWS account in a region where Amazon Bedrock is available (I'm going to be using Frankfurt, eu-central-1), as well as an API key for one of the out-of-the-box providers (I will be using OpenAI).



Local installation

Getting up and running is nice and easy, and the project has done a great job of simplifying the process. The installation guide is straightforward.

Note! I was running Python 3.10.11 on macOS.

python -m venv ragna
source ragna/bin/activate
cd ragna
pip install --upgrade pip
pip install 'ragna[all]'

Let's kick the tyres. It's good to see the docs have some tutorials.

When we run the command "ragna ui" we get the following:

The configuration file ./ragna.toml does not exist.
If you don't have a configuration file yet, run ragna init to generate one.

Running "ragna init" lets us go through the Ragna setup wizard.

ragna init 

        Welcome to the Ragna config creation wizard!

I'll help you create a configuration file to use with ragna.
Due to the large amount of parameters, I unfortunately can't cover everything. If you want to customize everything, please have a look at the documentation instead.

? Which of the following statements describes best what you want to do? (Use arrow keys)
 » I want to try Ragna without worrying about any additional dependencies or setup.
   I want to try Ragna and its builtin source storages and assistants, which potentially require additional dependencies or setup.
   I have used Ragna before and want to customize the most common parameters.

It will ask you to provide inputs for three questions:

  • Which of the following statements describes best what you want to do?
  • Which source storages do you want to use?
  • Which assistants do you want to use?

At this point it's probably worth reading the docs to understand what these mean in the context of this application. They've provided a page that gives the details of what's happening when you run "ragna init".

ragna ui
/Users/ricsue/Projects/GenAI/oss-ragna/ragna/ragna/__init__.py:6: UserWarning: ragna was not properly installed!
  warnings.warn("ragna was not properly installed!")
INFO:   RagnaDemoAuthentication: You can log in with any username and a matching password.
INFO:     Started server process [60960]
INFO:     Waiting for application startup.
INFO:     Application startup complete.
INFO:     Uvicorn running on http://127.0.0.1:31476 (Press CTRL+C to quit)
INFO:     127.0.0.1:62877 - "GET / HTTP/1.1" 200 OK
Launching server at http://localhost:31477
INFO:     127.0.0.1:62923 - "GET / HTTP/1.1" 200 OK

I was a bit worried that this was not going to work, but launching a browser and opening up http://localhost:31477 did bring up a web page and prompted me to log in. From the output, it looks like I can just use any input to log in.

After a few seconds, I was greeted with the Ragna UI.

When we click on the NEW CHAT button, we can see the options that allow us to change both the vector store used for storing the documents we upload, as well as the AI assistant we want to use.

new chat dialog

I like that it allows you to configure "dummy" configurations to kick the tyres of how this tool works.

dummy demo

I configure the DemoSourceStorage and then upload a local pdf I have (one about Apache Airflow), and I can then start to use the tool to interact with this document. As I have not actually configured a foundation model, this just gives me back dummy information. I like how this is presented, with a link to the source, and information about the model used (at the top).

Let's quit (^C from the running process) and now hook in a real model.

Configuring this with OpenAI

From the terminal I first export my OpenAI API key:

export OPENAI_API_KEY="sk-7mixxxxxxxxxxxxxxxx"
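Before restarting Ragna, it's worth a quick sanity check that the key is actually visible to new processes in this shell. A minimal sketch (not part of Ragna, just a belt-and-braces check):

```python
import os

# Quick sanity check that the OpenAI key is exported in this shell.
key = os.environ.get("OPENAI_API_KEY", "")
looks_valid = key.startswith("sk-")
print("OPENAI_API_KEY set and looks plausible:", looks_valid)
```

If this prints False, the assistant calls will fail later, so it's cheaper to catch it here.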

I now take a look at the ragna.toml configuration file that was created by the "ragna init" command, as I need to update it to include the OpenAI assistant.

I update it using information from the docs here:

[core]
queue_url = "reminiscence"
doc = "ragna.core.LocalDocument"
source_storages = ["ragna.source_storages.Chroma", "ragna.source_storages.RagnaDemoSourceStorage", "ragna.source_storages.LanceDB"]
assistants = ["ragna.assistants.Gpt4"]

After saving the file, I re-run Ragna in UI mode and try again. When I now start a new chat, I can see that OpenAI/gpt-4 is my new default assistant.

After uploading the same document, I can now begin interacting with it.

After asking it some basic questions, I'm not getting very far and it isn't working as expected.

not working

There are no errors, so I'm wondering what the issue might be here. I suspect it's probably the defaults I used when initialising Ragna. I can reconfigure by re-running "ragna init".

Hint! When you're using the ragna init command, you can use the up and down arrows to choose, and then the space bar to select. You'll notice that when you press the space bar, the option becomes highlighted/selected.

? Which of the following statements describes best what you want to do? I have used Ragna before and want to customize the most common parameters.

ragna has the following components builtin. Select the ones that you want to use. If the requirements of a selected component are not met, I'll show you instructions how to meet them later.

? Which source storages do you want to use? done (3 selections)

? Which assistants do you want to use? done (3 selections)

? Where should local files be stored? /Users/ricsue/.cache/ragna

? Ragna internally uses a task queue to perform the RAG workflow. What kind of queue do you want to use? file system: The local file system is used to build the queue. Starting a ragna worker is required. Requires the worker to be run on the same machine as the main thread.

? Where do you want to store the queue files? /Users/ricsue/.cache/ragna/queue

? At what URL do you want the ragna REST API to be served? http://127.0.0.1:31476

? Do you want to use a SQL database to persist the chats between runs? Yes

? What is the URL of the database? sqlite:////Users/ricsue/.cache/ragna/ragna.db

? At what URL do you want the ragna web UI to be served? http://127.0.0.1:31477

The output path /Users/ricsue/Projects/GenAI/oss-ragna/ragna/ragna.toml already exists and you did not pass the --force flag to overwrite it.

? What do you want to do? Overwrite the existing file.

And with that we are done 🎉 I am writing the configuration file to /Users/ricsue/Projects/GenAI/oss-ragna/ragna/ragna.toml.

This time I was asked a lot more questions, after selecting "I have used Ragna before and want to customize the most common parameters." from the first question. When I restart the UI and begin a new chat, I can see I have some more options.

ragna additional options

When I upload the document it takes a lot longer this time to process it. I can see in the logs it's doing some extra work:

INFO:     127.0.0.1:50231 - "OPTIONS /document?name=amazon-mwaa-mg.pdf HTTP/1.1" 200 OK
INFO:     127.0.0.1:50231 - "GET /document?name=amazon-mwaa-mg.pdf HTTP/1.1" 200 OK
INFO:     127.0.0.1:50231 - "POST /document HTTP/1.1" 200 OK
INFO:     127.0.0.1:50232 - "POST /chats HTTP/1.1" 200 OK
INFO:huey:Executing ragna.core._queue._Task: 713407ab-d3f0-4df1-84a1-e4c3bf14c6e8
/Users/ricsue/.cache/chroma/onnx_models/all-MiniLM-L6-v2/onnx.tar.gz: 100%|████████████████████████████████████████| 79.3M/79.3M [00:27<00:00, 3.02MiB/s]
INFO:huey:ragna.core._queue._Task: 713407ab-d3f0-4df1-84a1-e4c3bf14c6e8 executed in 30.799s
INFO:     127.0.0.1:50232 - "POST /chats/febb8ba7-fccf-4380-aba9-502134fd2050/prepare HTTP/1.1" 200 OK
INFO:     127.0.0.1:50232 - "GET /chats HTTP/1.1" 200 OK

And now we're in business. It's working great.

ragna demo working with openai



AWS Cloud9 installation

As regular readers of this blog will know, I'm a big fan of using AWS Cloud9 to make it super easy to try out new projects like this. I wanted to make sure that I could get the same experience when working in that environment.

After creating a fresh Cloud9 instance (running Ubuntu 22.04 LTS), I was ready to go.

Note! I also increased my disk space to 30GB using this.

I set up a new virtual environment, clone the project, and then install all the dependencies. This time, however, I'm going to use Conda to set up the virtual environment. First we have to install Conda, which we can do easily by using the code provided on the Miniconda webpage here:

mkdir -p ~/miniconda3
wget https://repo.anaconda.com/miniconda/Miniconda3-latest-Linux-x86_64.sh -O ~/miniconda3/miniconda.sh
bash ~/miniconda3/miniconda.sh -b -u -p ~/miniconda3
rm -rf ~/miniconda3/miniconda.sh
~/miniconda3/bin/conda init bash

We can now set up our Ragna environment, by first checking out the source code and then creating a new environment for running Ragna using Conda. Remember, make sure you have started a new shell, otherwise conda will not be available:

git clone https://github.com/Quansight/ragna.git
cd ragna
conda env create --file environment-dev.yml
conda activate ragna-dev
pip install chromadb
pip install 'ragna[all]'
cd ragna

Now we can set up Ragna as we did for our local setup.

After running "ragna init" and answering the configuration choices, the tool generates our configuration file (called ragna.toml), which we can take a look at:

Configuration Choices – After running ragna init, I selected "I have used Ragna before and want to customize the most common parameters." as the first option, and "Chroma" for the next. I selected both "OpenAI/gpt-4" and "OpenAI/gpt-3.5-turbo-16k" for the next. I leave the default for the next option, "/home/ec2-user/.cache/ragna". I select "File system" for the next. I leave the default option for the next, "/home/ec2-user/.cache/ragna/queue", as well as the following option for the REST API, "http://127.0.0.1:31476". I select Y to using SQL to persist chats, and then use the default URL for the database, "sqlite:////home/ec2-user/.cache/ragna/ragna.db". For the final option, you should accept the default port number.

local_cache_root = "/home/ec2-user/.cache/ragna"

[core]
queue_url = "/home/ec2-user/.cache/ragna/queue"
document = "ragna.core.LocalDocument"
source_storages = ["ragna.source_storages.Chroma", "ragna.source_storages.RagnaDemoSourceStorage", "ragna.source_storages.LanceDB"]
assistants = ["ragna.assistants.RagnaDemoAssistant", "ragna.assistants.Gpt4", "ragna.assistants.Gpt35Turbo16k"]

[api]
url = "http://127.0.0.1:31476"
origins = ["http://127.0.0.1:31477"]
database_url = "sqlite:////home/ec2-user/.cache/ragna/ragna.db"
authentication = "ragna.core.RagnaDemoAuthentication"

[ui]
url = "http://127.0.0.1:31477"
origins = ["http://127.0.0.1:31477"]

Checking the configuration

Ragna provides some nice CLI tools to help you work with it. One of the tools is "ragna check", which allows you to validate a given Ragna configuration file (ragna.toml) to make sure it will work. If we run this now, we get the following:

                               source storages                                
┏━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━┓
┃    ┃ name                    ┃ environment variables ┃ packages            ┃
┡━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━┩
│ ✅ │ Chroma                  │                       │ ✅ chromadb>=0.4.13 │
│    │                         │                       │ ✅ tiktoken         │
├────┼─────────────────────────┼───────────────────────┼─────────────────────┤
│ ✅ │ Ragna/DemoSourceStorage │                       │                     │
├────┼─────────────────────────┼───────────────────────┼─────────────────────┤
│ ✅ │ LanceDB                 │                       │ ✅ chromadb>=0.4.13 │
│    │                         │                       │ ✅ tiktoken         │
│    │                         │                       │ ✅ lancedb>=0.2     │
│    │                         │                       │ ✅ pyarrow          │
└────┴─────────────────────────┴───────────────────────┴─────────────────────┘
                             assistants                             
┏━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━┓
┃    ┃ name                     ┃ environment variables ┃ packages ┃
┡━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━┩
│ ❌ │ OpenAI/gpt-4             │ ❌ OPENAI_API_KEY     │          │
├────┼──────────────────────────┼───────────────────────┼──────────┤
│ ❌ │ OpenAI/gpt-3.5-turbo-16k │ ❌ OPENAI_API_KEY     │          │
└────┴──────────────────────────┴───────────────────────┴──────────┘

So it looks okay, except we have not yet configured our OpenAI key. We can quickly fix that now:

export OPENAI_API_KEY="sk-7mxxxxxxxxxxx"

and when we re-run the tool:

                             assistants                             
┏━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━┓
┃    ┃ name                     ┃ environment variables ┃ packages ┃
┡━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━┩
│ ✅ │ OpenAI/gpt-4             │ ✅ OPENAI_API_KEY     │          │
├────┼──────────────────────────┼───────────────────────┼──────────┤
│ ✅ │ OpenAI/gpt-3.5-turbo-16k │ ✅ OPENAI_API_KEY     │          │
└────┴──────────────────────────┴───────────────────────┴──────────┘

We can now see this looks okay.

Launching the Ragna UI on AWS Cloud9

In order to access the Ragna UI via a browser, we first have to set up an ssh tunnel from your local developer machine (mine is my Macbook) to the AWS Cloud9 instance. The steps involved are:

  • set up ssh keys for our AWS Cloud9 instance
  • open up ssh access to our AWS Cloud9 instance via the Inbound Security groups
  • grab private IP and public DNS information for our ssh tunnel
  • set up networking dependencies for the Ragna UI
  • create our ssh tunnel

Seems like a lot, but it's all pretty straightforward.

Adding an ssh key to an existing AWS Cloud9 instance

If you used the default options when creating your Cloud9 instance, you will not have an AWS keypair to connect to your Cloud9 instance. (You can skip the next section if you already have an ssh key you can use for tunnelling.)

In order to use ssh tunnelling, you need to set up an ssh key on the instance that powers the AWS Cloud9 environment. To do this I used an existing keypair I had for the AWS region, and then generated a public key using the following command:

ssh-keygen -y -f {private-keypair.pem} > {public-key.pem}

This creates a copy of the public key, which I then copied to the EC2 instance running my Cloud9 environment, and then updated the "~/.ssh/authorized_keys" file, adding the key. This allowed me to ssh to this instance using the private key.

Note! You'll need to open ssh via the Security groups, so don't forget to do that. Make sure you narrow the scope to only your IP address; you do not want to open it up to the world. If you don't do this, then when you try to initiate your ssh tunnel, it will just "hang" and eventually time out.

Setting up the local ssh tunnel

Now that I have my ssh key, the next thing I need to do is set up the tunnel in a local terminal session (on the machine you're working from, not the browser-based Cloud9).

First we need to grab both the private IPv4 address of the Cloud9 instance, as well as the public IPv4 DNS name. The easiest way to do this is to use a tool called "ec2-metadata", which is available in a package we can install with the following command:

sudo apt install amazon-ec2-utils

To get the private IPv4 address we run the following command using the ec2-metadata tool:

ec2-metadata -o

local-ipv4: 172.31.11.101

Note! An alternative way to get this information, if you do not want to install this package, is to run some commands. The first thing we need to do is grab the instance-id, which we will use in subsequent commands. You can access the local instance metadata this way:

wget -q -O - http://169.254.169.254/latest/meta-data/instance-id

i-091a2adc81a4e3391

You can now use that with the following command to get the private IPv4 address:

aws --region eu-west-3 \
ec2 describe-instances \
--filters \
"Name=instance-state-name,Values=running" \
"Name=instance-id,Values=i-091a2adc81a4e3391" \
--query 'Reservations[*].Instances[*].[PrivateIpAddress]' \
--output text

172.31.11.101

We now need to set up a local network alias on our Cloud9 instance, which we will configure Ragna to use when listening for API requests. From a terminal window on the Cloud9 instance, add the following entry into your local hosts file, replacing the IP address with the private IP you got from the output of the previous step:



172.31.11.101 api



Why are we doing this? We are doing this to avoid getting CORS errors when accessing the Chat UI from our local machine.

The next step is to grab the Public IPv4 DNS name of our Cloud9 instance. If you installed the ec2-utils package, then you can run the following command:



ec2-metadata -p

public-hostname: ec2-13-36-178-111.eu-west-3.compute.amazonaws.com



You now have everything you need: the private IPv4 address for your Cloud9 instance, and the public IPv4 DNS name for your instance. We're good to go to set up our local SSH tunnel.

Note! If you didn't install the ec2-metadata tool, use this command to get the Public IPv4 DNS name, grabbing the instance id from the earlier steps.



aws ec2 describe-instances --instance-ids i-091a2adc81a4e3391 --query 'Reservations[].Instances[].PublicDnsName'


To create the tunnel, you can run the following command, making sure that you use the ssh key that you used to configure ssh access to your instance, and replacing the value with the IPv4 DNS name of your Cloud9 instance:



ssh -i paris-ec2.pem -Ng -L 0.0.0.0:31476:ec2-13-36-178-111.eu-west-3.compute.amazonaws.com:31476 -L 0.0.0.0:31477:ec2-13-36-178-111.eu-west-3.compute.amazonaws.com:31477 ubuntu@ec2-13-36-178-111.eu-west-3.compute.amazonaws.com -v



Note! If you do not use the DNS name of your Cloud9 instance, and instead use the IP address, you will likely find that your ssh tunnel does not work.

There is one final step before you can test. On your local machine, add the following into your local /etc/hosts:



127.0.0.1       api



This will send all requests to http://api through the ssh tunnel.

Testing the Ragna UI

There are a couple of things we need to do before we can access Ragna from our browser.

The first is that we need to update the Ragna configuration file, ragna.toml, with details of the api url. We change the configuration file from the default:



[api]
url = "http://127.0.0.1:31476"
origins = ["http://127.0.0.1:31477"]
database_url = "sqlite:////dwelling/ubuntu/.cache/ragna/ragna.db"
authentication = "ragna.core.RagnaDemoAuthentication"



to the following:



[api]
url = "http://api:31476"
origins = ["http://127.0.0.1:31477"]
database_url = "sqlite:////dwelling/ubuntu/.cache/ragna/ragna.db"
authentication = "ragna.core.RagnaDemoAuthentication"



We also have to change the ui url and origins, from the default:



[ui]
url = "http://127.0.0.1:31477"
origins = ["http://127.0.0.1:31477"]



to the following, using the Cloud9 private IPv4 address from the previous steps:



[ui]
url = "http://172.31.11.101:31477"
origins = ["http://172.31.11.101:31477", "http://127.0.0.1:31477"]



Now that we have updated that, we can start Ragna by running the following command:



ragna ui



Which should now start showing the following information:



INFO:   RagnaDemoAuthentication: You can log in with any username and a matching password.
INFO:     Started server process [6582]
INFO:     Waiting for application startup.
INFO:     Application startup complete.
INFO:     Uvicorn running on http://api:31476 (Press CTRL+C to quit)
INFO:     172.31.11.101:58216 - "GET / HTTP/1.1" 200 OK
Launching server at http://localhost:31477
INFO:huey.consumer:Huey consumer started with 1 thread, PID 6590 at 2023-11-12 12:36:02.478338



From a browser, you can now open the Ragna UI by accessing "http://localhost:31477", and we can repeat the steps above to log in, upload a document, and then start interacting with it using OpenAI.

c9 ragna web ui

As you use it, you will see output appear in the logs. After playing with it and testing its capabilities, shut it down by going to the Cloud9 terminal where you ran the "ragna ui" command, and hit CTRL and C (^C) to interrupt the running process.



Configuring this with Amazon Bedrock

I've started using Amazon Bedrock, so how can we use that with Ragna? As this is open source, we can take a look at the documentation and code, and see if we can build a new assistant. Specifically, I want to add Anthropic Claude v1 and v2 as options so I can use these within the API and Chat UI. Let's see how we can do that.

The project does outline in the contribution guide how to get started with local development, so we can start there. Once we complete the steps, we check the version we're running.

Note!! A quick note before proceeding. In my AWS account I had already been using Amazon Bedrock's foundation models, but by default these are not all enabled and available. I had to specifically enable the Anthropic Claude v1 and v2 models in my eu-central-1 AWS region, so make sure you do this, otherwise when you go to make the API calls, they will fail.

I'm going to set up a new Python environment to do this, following the instructions from the guide.



git clone https://github.com/Quansight/ragna bedrock-ragna
cd bedrock-ragna




I make one small change, and edit the name of the Python virtual environment in the environment-dev.yml file (changing "name: ragna-dev" to "name: bedrock-ragna-dev"), and run this to install everything I need.



conda env create --file environment-dev.yml
conda activate bedrock-ragna-dev
pip install --editable '.[all]'
..
<takes about 5-10 minutes to build/install>
..



We can now check we're running the local version by running this command:



ragna --version
ragna 0.1.2.dev1+g6209845 from /home/ubuntu/environment/bedrock-ragna/ragna



This looks good; we are now ready to see how to get Amazon Bedrock's foundation models integrated.

As Amazon Bedrock is new, we need to make sure we're using an up-to-date version of the boto3 library. At the time of writing, the version I was using is 1.28.79, so if you have anything later than this, you should be good to go.
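If you want to check this programmatically, a simple version comparison works. Treat the minimum below as an assumption (Bedrock runtime support landed in boto3 around 1.28.57; check the boto3 changelog if in doubt):

```python
def bedrock_ready(version: str, minimum: str = "1.28.57") -> bool:
    """Return True if a boto3 version string is new enough for the
    bedrock-runtime client (the minimum here is an assumption)."""
    def as_tuple(v: str) -> tuple[int, ...]:
        # "1.28.79" -> (1, 28, 79), so tuple comparison orders versions correctly
        return tuple(int(part) for part in v.split(".")[:3])
    return as_tuple(version) >= as_tuple(minimum)

# The version used in this post:
print(bedrock_ready("1.28.79"))  # True
```

On the real machine you would pass in `boto3.__version__`, or simply run `pip install --upgrade boto3`.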

From a quick look at the code, it looks like I need to make two modifications, both in the assistants directory.



├── _api
├── _cli
├── _ui
│   ├── components
│   ├── imgs
│   └── resources
├── assistants
│   ├── __init__.py
│   ├── _anthropic.py
│   ├── _api.py
│   ├── _demo.py
│   ├── _mosaicml.py
│   └── _openai.py
├── core
└── source_storages




The first is to create a new assistant in the assistants directory. I created a new file called "_bedrock.py" and then added this code:



from typing import cast
import boto3
import json
import os
from ragna.core import RagnaException, Source

from ._api import ApiAssistant


class AmazonBedrockAssistant(ApiAssistant):
    _API_KEY_ENV_VAR = "BEDROCK_AWS_REGION"
    _MODEL: str
    _CONTEXT_SIZE: int

    @classmethod
    def display_name(cls) -> str:
        return f"AmazonBedRock/{cls._MODEL}"

    @property
    def max_input_size(self) -> int:
        return self._CONTEXT_SIZE

    def _instructize_prompt(self, prompt: str, sources: list[Source]) -> str:
        instruction = (
            "\n\nHuman: "
            "Use the following pieces of context to answer the question at the end. "
            "If you don't know the answer, just say so. Don't try to make up an answer.\n"
        )

        instruction += "\n\n".join(source.content for source in sources)
        return f"{instruction}\n\nQuestion: {prompt}\n\nAssistant:"

    def _call_api(
        self, prompt: str, sources: list[Source], *, max_new_tokens: int
    ) -> str:
        # Make sure you set your AWS credentials as environment variables.
        # Alternatively you can modify this section and change how you call AWS services.
        bedrock = boto3.client(
            service_name="bedrock-runtime",
            region_name=os.environ["BEDROCK_AWS_REGION"],
        )
        # See https://docs.aws.amazon.com/bedrock/latest/APIReference/API_runtime_InvokeModel.html
        prompt_config = {
            "prompt": self._instructize_prompt(prompt, sources),
            "max_tokens_to_sample": max_new_tokens,
            "temperature": 0.0,
        }

        try:
            response = bedrock.invoke_model(
                body=json.dumps(prompt_config),
                modelId=f"anthropic.{self._MODEL}",
            )

            response_body = json.loads(response.get("body").read())

            return cast(str, response_body.get("completion"))
        except Exception as e:
            raise ValueError(f"Error raised by inference endpoint: {e}")


class AmazonBedRockClaude(AmazonBedrockAssistant):
    """[Amazon Bedrock Claude v2](https://docs.aws.amazon.com/bedrock/latest/userguide/what-is-bedrock.html#models-supported)

    !!! info "AWS credentials required. Please set AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY, and AWS_REGION environment variables"
    """

    _MODEL = "claude-v2"
    _CONTEXT_SIZE = 100_000


class AmazonBedRockClaudev1(AmazonBedrockAssistant):
    """[Amazon Bedrock Claude v1](https://docs.aws.amazon.com/bedrock/latest/userguide/what-is-bedrock.html#models-supported)

    !!! info "AWS credentials required. Please set AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY, and AWS_REGION environment variables"
    """

    _MODEL = "claude-instant-v1"
    _CONTEXT_SIZE = 100_000



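The interesting part of the assistant is how it wraps the retrieved chunks into the Human/Assistant prompt format that Claude expects. Stripped of the Ragna and boto3 plumbing, the idea can be sketched standalone (the function name and the sample source texts below are made up for illustration):

```python
def build_claude_prompt(question: str, sources: list[str]) -> str:
    # Claude (via Bedrock) expects a "\n\nHuman: ... \n\nAssistant:" prompt;
    # the retrieved chunks are packed between the instruction and the question.
    instruction = (
        "\n\nHuman: "
        "Use the following pieces of context to answer the question at the end. "
        "If you don't know the answer, just say so. Don't try to make up an answer.\n"
    )
    instruction += "\n\n".join(sources)
    return f"{instruction}\n\nQuestion: {question}\n\nAssistant:"


prompt = build_claude_prompt(
    "What is Apache Airflow?",
    ["Airflow is a workflow orchestrator.", "MWAA is a managed Airflow service."],
)
print(prompt)
```

In the real assistant the sources come from the configured source storage as `Source` objects, and the resulting string is sent as the `prompt` field of the InvokeModel request body.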

Now that we have our assistant, the next task is to update the "__init__.py", which gives Ragna details of the available assistants.



__all__ = [
    "Claude",
    "ClaudeInstant",
    "Gpt35Turbo16k",
    "Gpt4",
    "Mpt7bInstruct",
    "Mpt30bInstruct",
    "RagnaDemoAssistant",
    "AmazonBedRockClaude",
    "AmazonBedRockClaudev1"
]

from ._anthropic import Claude, ClaudeInstant
from ._demo import RagnaDemoAssistant
from ._mosaicml import Mpt7bInstruct, Mpt30bInstruct
from ._openai import Gpt4, Gpt35Turbo16k
from ._bedrock import AmazonBedRockClaude, AmazonBedRockClaudev1



# isort: split

from ragna._utils import fix_module

fix_module(globals())
del fix_module




That should be it. Let's give it a go.

First we need to configure Ragna using the "ragna init" command. We'll choose similar options as before; the key difference is that this time we should hopefully see Amazon Bedrock appear as an option.



? Which of the following statements describes best what you want to do? (Use arrow keys)
   I want to try Ragna without worrying about any additional dependencies or setup.
   I want to try Ragna and its builtin source storages and assistants, which potentially require additional dependencies or setup.
 » I have used Ragna before and want to customize the most common parameters.

? Which source storages do you want to use? (Use arrow keys to move, <space> to select, <a> to toggle, <i> to invert)
 » ● Chroma
   ● Ragna/DemoSourceStorage
   ● LanceDB

? Which assistants do you want to use? (Use arrow keys to move, <space> to select, <a> to toggle, <i> to invert)
   ○ Anthropic/claude-2
   ○ Anthropic/claude-instant-1
   ○ Ragna/DemoAssistant
   ○ MosaicML/mpt-7b-instruct
   ○ MosaicML/mpt-30b-instruct
   ○ OpenAI/gpt-4
   ○ OpenAI/gpt-3.5-turbo-16k
   ● AmazonBedrock/claude-v2
 » ● AmazonBedrock/claude-instant-v1



We can see that our new providers (assistants) are there. I deselected the others so that only the two Amazon Bedrock providers are listed.



You've gotten chosen elements, which have further necessities that arecurrently not met.

To make use of the chosen elements, that you must set the next atmosphere variables: 

- BEDROCK_AWS_REGION

Tip: You may test the provision of the necessities with ragna test.

? The place ought to native information be saved? /dwelling/ubuntu/.cache/ragna



Don't worry about this; we will set it before we run Ragna. Accept the default value here.



? Ragna internally uses a task queue to perform the RAG workflow. What kind of queue do you want to use? (Use arrow keys)
   memory: Everything runs sequentially on the main thread as if there were no task queue.
 » file system: The local file system is used to build the queue. Starting a ragna worker is required. Requires the worker to be run on the same machine as the main thread.
   redis: Redis is used as queue. Starting a ragna worker is required.

? Where do you want to store the queue files? /home/ubuntu/.cache/ragna/queue

? At what URL do you want the ragna REST API to be served? http://127.0.0.1:31476

? Do you want to use a SQL database to persist the chats between runs? Yes

? What is the URL of the database? sqlite:////home/ubuntu/.cache/ragna/ragna.db

? At what URL do you want the ragna web UI to be served? http://127.0.0.1:31477

And with that we are done 🎉 I am writing the configuration file to /home/ubuntu/environment/bedrock-ragna/ragna/ragna.toml.



Okay, so we now have our configuration file, which we can check by opening up "ragna.toml"



local_cache_root = "/dwelling/ubuntu/.cache/ragna"

[core]
queue_url = "/dwelling/ubuntu/.cache/ragna/queue"
doc = "ragna.core.LocalDocument"
source_storages = ["ragna.source_storages.Chroma", "ragna.source_storages.RagnaDemoSourceStorage", "ragna.source_storages.LanceDB"]
assistants = ["ragna.assistants.AmazonBedRockClaude", "ragna.assistants.AmazonBedRockClaudev1"]

[api]
url = "http://127.0.0.1:31476"
origins = ["http://127.0.0.1:31477"]
database_url = "sqlite:////dwelling/ubuntu/.cache/ragna/ragna.db"
authentication = "ragna.core.RagnaDemoAuthentication"

[ui]
url = "http://127.0.0.1:31477"
origins = ["http://127.0.0.1:31477"]



We're going to make a few changes so that we can access this via our browser, specifically to the api and ui sections, so that we end up with the following:



local_cache_root = "/dwelling/ubuntu/.cache/ragna"

[core]
queue_url = "/dwelling/ubuntu/.cache/ragna/queue"
doc = "ragna.core.LocalDocument"
source_storages = ["ragna.source_storages.Chroma", "ragna.source_storages.RagnaDemoSourceStorage", "ragna.source_storages.LanceDB"]
assistants = ["ragna.assistants.AmazonBedRockClaude", "ragna.assistants.AmazonBedRockClaudev1"]

[api]
url = "http://api:31476"
origins = ["http://127.0.0.1:31477"]
database_url = "sqlite:////dwelling/ubuntu/.cache/ragna/ragna.db"
authentication = "ragna.core.RagnaDemoAuthentication"

[ui]
url = "http://127.0.0.1:31477"
origins = ["http://127.0.0.1:31477"]



If we now run "ragna check" we see the following:



                               source storages                               
┏━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━┓
┃    ┃ name                    ┃ environment variables ┃ packages            ┃
┡━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━┩
│ ✅ │ Chroma                  │                       │ ✅ chromadb>=0.4.13 │
│    │                         │                       │ ✅ tiktoken         │
├────┼─────────────────────────┼───────────────────────┼─────────────────────┤
│ ✅ │ Ragna/DemoSourceStorage │                       │                     │
├────┼─────────────────────────┼───────────────────────┼─────────────────────┤
│ ✅ │ LanceDB                 │                       │ ✅ chromadb>=0.4.13 │
│    │                         │                       │ ✅ tiktoken         │
│    │                         │                       │ ✅ lancedb>=0.2     │
│    │                         │                       │ ✅ pyarrow          │
└────┴─────────────────────────┴───────────────────────┴─────────────────────┘
                                assistants                                 
┏━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━┓
┃    ┃ name                            ┃ environment variables ┃ packages ┃
┡━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━┩
│ ❌ │ AmazonBedRock/claude-v2         │ ❌ BEDROCK_AWS_REGION │          │
├────┼─────────────────────────────────┼───────────────────────┼──────────┤
│ ❌ │ AmazonBedRock/claude-instant-v1 │ ❌ BEDROCK_AWS_REGION │          │
└────┴─────────────────────────────────┴───────────────────────┴──────────┘



Let's fix this by setting the region of the Amazon Bedrock API we want to access. As I'm in the UK, I'll use Frankfurt, eu-central-1.



export BEDROCK_AWS_REGION=eu-central-1


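Before starting Ragna, you can optionally sanity-check that your credentials resolve and that Bedrock is reachable in the chosen region. The following sketch assumes a recent AWS CLI v2 that includes the bedrock commands; both calls only print, they change nothing:

```shell
export BEDROCK_AWS_REGION=eu-central-1

# Optional: confirm which identity your credentials resolve to, and that
# Anthropic models are visible in the chosen region (skipped if the AWS
# CLI is not installed)
if command -v aws >/dev/null 2>&1; then
    aws sts get-caller-identity || echo "credentials not set up yet"
    aws bedrock list-foundation-models \
        --region "$BEDROCK_AWS_REGION" \
        --query 'modelSummaries[?providerName==`Anthropic`].modelId' \
        || echo "Bedrock not reachable in $BEDROCK_AWS_REGION"
fi
```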

And now when I run "ragna check" I get the following:



                               source storages                               
┏━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━┓
┃    ┃ name                    ┃ environment variables ┃ packages            ┃
┡━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━┩
│ ✅ │ Chroma                  │                       │ ✅ chromadb>=0.4.13 │
│    │                         │                       │ ✅ tiktoken         │
├────┼─────────────────────────┼───────────────────────┼─────────────────────┤
│ ✅ │ Ragna/DemoSourceStorage │                       │                     │
├────┼─────────────────────────┼───────────────────────┼─────────────────────┤
│ ✅ │ LanceDB                 │                       │ ✅ chromadb>=0.4.13 │
│    │                         │                       │ ✅ tiktoken         │
│    │                         │                       │ ✅ lancedb>=0.2     │
│    │                         │                       │ ✅ pyarrow          │
└────┴─────────────────────────┴───────────────────────┴─────────────────────┘
                                assistants                                 
┏━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━┓
┃    ┃ name                            ┃ environment variables ┃ packages ┃
┡━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━┩
│ ✅ │ AmazonBedRock/claude-v2         │ ✅ BEDROCK_AWS_REGION │          │
├────┼─────────────────────────────────┼───────────────────────┼──────────┤
│ ✅ │ AmazonBedRock/claude-instant-v1 │ ✅ BEDROCK_AWS_REGION │          │
└────┴─────────────────────────────────┴───────────────────────┴──────────┘



Side note: I did raise an issue and then a PR, which you can check out. It's probably worth checking back with the project once they've built a mechanism that allows additional assistants to be plugged in.

We can now start Ragna



ragna ui
INFO:   RagnaDemoAuthentication: You can log in with any username and a matching password.
INFO:     Started server process [16252]
INFO:     Waiting for application startup.
INFO:     Application startup complete.
INFO:     Uvicorn running on http://api:31476 (Press CTRL+C to quit)
INFO:     172.31.11.109:39566 - "GET / HTTP/1.1" 200 OK
Launching server at http://localhost:31477



and then open a browser at http://localhost:31477 (remembering to make sure our ssh tunnel is up and running). We can now add a new chat, and we see that we have Amazon Bedrock as an option, which we will select.

bedrock on c9

Once the document has been uploaded, you can now interact with it.

running bedrock on c9 queries

You will see information in your terminal that looks similar to the following.



INFO:     172.31.11.109:33096 - "GET /document?name=amazon-mwaa-mg.pdf HTTP/1.1" 200 OK
INFO:     172.31.11.109:33096 - "POST /document HTTP/1.1" 200 OK
INFO:     172.31.11.109:33100 - "POST /chats HTTP/1.1" 200 OK
INFO:huey:Executing ragna.core._queue._Task: f1f33e37-c54f-4cb8-bc3c-da0f46c95e60
INFO:huey:ragna.core._queue._Task: f1f33e37-c54f-4cb8-bc3c-da0f46c95e60 executed in 3.652s
INFO:     172.31.11.109:33100 - "POST /chats/9da85d22-a692-4ebb-a8ca-928c878a11fa/prepare HTTP/1.1" 200 OK
INFO:     172.31.11.109:33100 - "GET /chats HTTP/1.1" 200 OK
INFO:huey:Executing ragna.core._queue._Task: 6f6cdb8f-ca93-4367-be21-89eb708779d1
INFO:huey:ragna.core._queue._Task: 6f6cdb8f-ca93-4367-be21-89eb708779d1 executed in 0.098s
INFO:huey:Executing ragna.core._queue._Task: 5b071f0e-f078-4d12-bf5b-f4f116dcdd02 2 retries
INFO:botocore.credentials:Found credentials in shared credentials file: ~/.aws/credentials
INFO:huey:ragna.core._queue._Task: 5b071f0e-f078-4d12-bf5b-f4f116dcdd02 2 retries executed in 8.216s
INFO:     172.31.11.109:52654 - "POST /chats/9da85d22-a692-4ebb-a8ca-928c878a11fa/answer?prompt=What%20versions%20of%20Apache%20Airflow%20are%20mentioned%20in%20this%20document%3F HTTP/1.1" 200 OK



Congratulations, you have now configured Ragna to use the Claude v2 large language model. You can also try the Claude v1 model if you like, and experiment with different documents and queries.



Conclusion

In this blog post we introduced a new open source RAG orchestration framework called Ragna, and showed how you can get it up and running on your local machine, or on AWS in an AWS Cloud9 developer environment. We then looked at adding new large language models, integrating Amazon Bedrock's Anthropic Claude v1 and v2 models.

Ragna is still in active development, so if you like what you saw in this post, why not head over to the project and give it some love. I'd love to hear from any of you who try this out; let me know what you think.

That's all folks, but I will be back with more open source generative AI content soon. If you have an open source project in this space, let me know, as I'm always on the hunt for new projects to check out and get up and running on AWS.

Before you leave, I'd appreciate it if you could give me some feedback on what you thought of this post.

I'll leave you with some of the issues and errors I encountered along the way; they might help if you run into the same problems.



Dealing with errors

I ran into a few errors while getting familiar with Ragna and integrating Amazon Bedrock's foundation models. I thought I'd share them here in case they help anyone out.

Dangling thread

When I was playing with the Ragna API, I ran into a problem where, after using Jupyter to interact with the API, I could not restart the Ragna UI. It would start okay, with no errors generated, but the UI wouldn't work: I could log in, but then nothing.

I initially cleared out the existing temporary data (removing everything in ~/.cache/ragna) and restarted. At that point I could no longer log in at all, with sqlite errors appearing. Looking in the ~/.cache/ragna directory, I could see that the data files were not being created.

After messing about, I realised that there was still an open process holding the API port open, which was causing the Ragna UI to fail to start cleanly. Running this command



lsof -nP -iTCP -sTCP:LISTEN | grep 31476



allowed me to identify and then kill the offending process. Once I did this, Ragna restarted as expected.
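A slightly quicker variant, if you hit the same thing, is to let lsof print just the PID with -t and feed that straight to kill (31476 here is the REST API port from my ragna.toml; adjust to match yours):

```shell
# -t prints only the PID(s) of whatever is still listening on the port
PIDS=$(lsof -t -iTCP:31476 -sTCP:LISTEN || true)
if [ -n "$PIDS" ]; then
    kill $PIDS
fi
```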

Chromadb and sqlite3

When using Ragna you need to install the Python library chromadb ("pip install 'chromadb>=0.4.13'"), which appears to install fine. However, when you go to use it, you get the following error:



Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/home/ec2-user/environment/ragna/lib/python3.10/site-packages/chromadb/__init__.py", line 78, in <module>
    raise RuntimeError(
RuntimeError: Your system has an unsupported version of sqlite3. Chroma requires sqlite3 >= 3.35.0.
Please visit https://docs.trychroma.com/troubleshooting#sqlite to learn how to upgrade.



To get this working, I switched to using Conda to provide my virtual Python environment. I did try to follow the links provided, and lost many hours trying to get it working (without being too hacky and just disabling the check!). With Conda it just worked first time. I'll come back to this when I have more time to see if I can find a proper fix.
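If you'd rather not switch environments, the Chroma troubleshooting page linked in the error suggests swapping in the pysqlite3-binary package. A sketch of that workaround (the module swap has to run before chromadb is imported for the first time):

```python
import sqlite3
import sys


def sqlite_too_old(version: str = sqlite3.sqlite_version) -> bool:
    # Chroma requires sqlite3 >= 3.35.0
    return tuple(int(part) for part in version.split(".")) < (3, 35, 0)


if sqlite_too_old():
    try:
        # Requires: pip install pysqlite3-binary
        # Must run before chromadb's first import.
        __import__("pysqlite3")
        sys.modules["sqlite3"] = sys.modules.pop("pysqlite3")
    except ImportError:
        print("sqlite3 is too old and pysqlite3-binary is not installed")
```

I haven't verified this path myself (Conda got me unblocked first), so treat it as a pointer rather than a tested fix.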

error: subprocess-exited-with-error

If you get this error, which I hit a few times, it was down to a compatibility issue with newer versions of pip. When installing the pydantic libraries, I got the following errors:



  error: subprocess-exited-with-error

  × Preparing metadata (pyproject.toml) did not run successfully.
  │ exit code: 1
  ╰─> [6 lines of output]




Running "pip install pip==21.3.1" resolved this issue. It may well be fixed by the time you're reading this, but I'm including it as a reference:

Hat tip to this StackOverflow answer
