
Writing Python documentation with Cohere’s Generative AI


Occasionally, I write a script that customers can use. When I do, they often ask for documentation to accompany the script, which takes more time than intended. At the same time, documentation is always worthwhile, both in your code and as a formal document for the end user. To that end, I have long sought a quick way to generate script documentation. Yes, all kinds of documentation generators use comments and tags to produce documentation, but I was looking for something else.
Earlier in the year, I started to play with ChatGPT. Then I went to Oracle CloudWorld and came across Cohere. Between the two AI platforms, I finally found a way to write documentation, although not quite in a polished documentation style, simply by passing the code to an AI as a prompt. In case you didn't catch it at Oracle CloudWorld, Oracle is partnering with Cohere and embedding generative AI into many of its applications, which is why I have preferred Cohere over OpenAI.

How does Cohere help me with my documentation?

If you spend time working or playing in Cohere's Playground, you quickly see how the AI responds to simple requests to produce or generate text-based content. Cohere also provides code samples showing how to call its generative AI from various languages. In my case, I wanted to see how it looked in Python, and the code looked similar to this:

Note: you will need to get a trial API key from Cohere.

import cohere

co = cohere.Client('API Key')  # This is your trial API key
response = co.generate(
    model='command',
    prompt=<prompt>,  # your prompt text goes here
    max_tokens=3654,
    temperature=0.9,
    k=0,
    stop_sequences=[],
    return_likelihoods='NONE')
print('Prediction: {}'.format(response.generations[0].text))

As you can tell, it is straightforward code to interact with the Cohere Generative AI.

How did I use it to write documentation?

Using the same code above, I had to figure out how to pass a file, or at least the contents of the file, to the AI. After some thought, I realized all I had to do was use two files: one to read from and one to write to. The read file is the Python code I wrote, and the write file is the Microsoft Word document I wanted.

This resulted in a Python file that looks like this:

import cohere

co = cohere.Client('API Key')  # This is your trial API key

# Read the Python script that should be documented
with open("check_servicemanager.py", "r") as code_file:
    data = code_file.read()

doc_file = open("/docfile.doc", "w")

response = co.generate(
    model='command',
    prompt=data,
    max_tokens=2500,
    temperature=0.5,
    k=0,
    stop_sequences=[],
    return_likelihoods='NONE')

doc_file.write(response.generations[0].text)
# print('Prediction: {}'.format(response.generations[0].text))
doc_file.close()

A couple of things to notice:

1. I opened a file called "check_servicemanager.py" as read-only (this is the Python code I wanted to document).
2. The data variable is set to the contents of "check_servicemanager.py" as a string.
3. The file used for writing is a ".doc" file, so the output can be opened as a Microsoft Word document. The downside is that there may be compatibility issues when opening the finished file, especially on macOS, because the AI usually writes it out as plain text.
4. Within the co.generate block, the "prompt" is how the AI knows what to do. In this case, I set it to the variable data, prompting the AI to use "check_servicemanager.py" as input.
5. Lastly, the code writes the response from the AI to the desired file as standard text.
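The read-generate-write flow above can be sketched as a small helper. Note that `document_code` and the injected `generate_fn` are names I made up for this sketch, not part of the Cohere SDK; in real use you would pass in a function that wraps the `co.generate` call.

```python
# A minimal sketch of the read -> generate -> write flow.
# document_code and generate_fn are hypothetical names chosen for
# illustration; generate_fn stands in for the actual Cohere call.

def document_code(code_path, doc_path, generate_fn):
    """Read source code, ask the AI for documentation, write it out."""
    with open(code_path, "r") as code_file:
        data = code_file.read()

    # e.g. generate_fn = lambda p: co.generate(model='command',
    #     prompt=p, max_tokens=2500).generations[0].text
    text = generate_fn(data)

    with open(doc_path, "w") as doc_file:
        doc_file.write(text)
    return text
```

Using `with` for both files means neither file needs an explicit `close()` call, since the context manager closes each file when its block ends.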

What was the output?

You may be wondering what output was generated into the write file. It was simply an evaluation of the code, as shown below:

The code you provided is a Python script that checks the deployment status of a GoldenGate ServiceManager. It does this by requesting the GoldenGate server’s API and retrieving the status of the ServiceManager deployment. The script then prints the status to the console.
Here is an explanation of the code:
– The code starts with importing the required packages: `requests`, `json`, `os`, `sys`, and `time`. It then disables warnings and defines a function to handle control-c interruptions.
– The `deployment_status` function makes a request to the GoldenGate server’s API to retrieve the status of the ServiceManager deployment. It then prints the status to the console.
– The `main` routine sets the variables for the GoldenGate server’s URL, administrator username, and password. It then sets a signal handler for control-c interruptions and calls the `deployment_status` function with the ServiceManager deployment name.
– The `sig_handler` function is a signal handler that is defined to handle control-c interruptions. It prompts the user to confirm if they want to exit and exits the program if the user confirms.
The code appears to work as expected, but it is difficult to determine the specific behavior without the deployment name and GoldenGate server URL. It is also important to ensure that the GoldenGate server’s API is accessible and that the administrator username and password are correct.

If the generated output doesn't quite fit what you want for documentation, you can run the code multiple times to get different results. I found that changing the "temperature" produced different results, based on the degree of randomness this setting controls.
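One way to compare the effect of different temperatures is to run the same prompt at several settings and collect the outputs side by side. This is a sketch; `sweep_temperatures` and `generate_fn` are my own names, and the real call would wrap `co.generate` as shown in the comment.

```python
# Sketch: run the same prompt at several temperature settings and
# collect the results for comparison. generate_fn stands in for the
# real Cohere call and takes (prompt, temperature).

def sweep_temperatures(prompt, temperatures, generate_fn):
    """Return a {temperature: generated_text} mapping."""
    return {t: generate_fn(prompt, t) for t in temperatures}

# With the client from earlier, generate_fn might look like:
#   lambda p, t: co.generate(model='command', prompt=p,
#       max_tokens=2500, temperature=t).generations[0].text
```

Saving each result to its own file would then let you pick the run whose style is closest to the documentation you want.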

Conclusion

Using AI greatly helped me get a start on writing documentation for scripts and other items. At the same time, I still needed to adjust the output to say what I wanted the documentation to say. Using AI to speed up documentation writing is good and flexible, yet it still requires a human to put some brain power into the final document.
