Sending Logs to Oracle Management Cloud using REST API

Sending logs to Oracle Management Cloud using REST API is the first in a series of blogs designed to show the flexibility in options for uploading logs to Oracle Management Cloud Log Analytics. Be sure to read the rest of the blog series to see the other methods that can be used to send logs to Log Analytics.

Upload from REST API

Since just about everything in Oracle Management Cloud can be done via API calls, you can also build this upload into a script. This is useful, for example, when you need to pull logs from a SaaS application by API and then send them to Log Analytics. You can either stage the data in a log file in between, or write a script that pulls and sends immediately. You can also upload JSON using this method, which lets you automate the GET from the source API and the POST to Log Analytics. While this is a great method for testing, and may be necessary for some SaaS environments, be sure to automate it as much as you can so that you don’t end up with gaps or missing log data.
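The pull-and-send pattern described above can be sketched as a small shell script. The source API URL, OMC endpoint, tenant, entity, and log source names below are all hypothetical placeholders; by default the script only prints the request it would make.

```shell
#!/bin/sh
# Sketch of a pull-and-send script. The source API, OMC endpoint, tenant,
# entity, and log source names are hypothetical placeholders.
SOURCE_URL="https://saas.example.com/api/logs"     # hypothetical SaaS source API
OMC_BASE="https://myomc.example.com/serviceapi/logan.uploads"
STAGE_FILE="saas_logs.json"

# Build the upload URL, with spaces in parameter values encoded as %20
TARGET="$OMC_BASE?uploadName=SaaSPull&entityName=abc&entityType=Host%20(Linux)&logSourceName=Custom%20JSON%20Logs"

# DRY_RUN=1 (the default here) just prints the request; set DRY_RUN=0 to execute it
if [ "${DRY_RUN:-1}" = "0" ]; then
  # GET the logs from the source API into a staging file...
  curl -s "$SOURCE_URL" -o "$STAGE_FILE"
  # ...then POST the staged file to Log Analytics
  curl -s -k -u "$OMC_USER" -X POST \
    -H 'X-USER-IDENTITY-DOMAIN-NAME:mytenant' \
    -F "data=@$STAGE_FILE" "$TARGET"
else
  echo "Would POST $STAGE_FILE to $TARGET"
fi
```

Staging to a file first (rather than piping GET straight into POST) makes it easy to keep a local copy for troubleshooting failed uploads.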

If you’re uploading files, they can also be uploaded in any of the archive formats listed below:

  • zip (can also contain gz and tar archives)
  • tar
  • tar.gz
  • tgz
  • gz
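For example, several log files can be bundled into one of these archive formats before upload so they go up in a single request (the filenames below are hypothetical):

```shell
# Create sample log files and bundle them into a tgz archive for a single upload
# (filenames are hypothetical examples)
echo "sample line 1" > alert_1.log
echo "sample line 2" > alert_2.log
tar -czf logs.tgz alert_1.log alert_2.log

# List the archive contents to verify the bundle
tar -tzf logs.tgz
```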

Uploading by API is done by performing a POST to the ../serviceapi/logan.uploads endpoint, followed by a few query parameters that tell it which entity to store the data against and which log source to parse with.

  • uploadName = a simple name that will show up in the UI to “group” this upload set
  • entityName = an existing Entity Name (host, database, etc.); it will be created if it doesn’t exist
  • entityType = the entity type, used to determine valid Log Sources (Host (Linux), Oracle Database Instance, etc.)
  • logSourceName = an existing Log Source found in OMC (Linux Syslog Logs, Oracle Database Alert Logs, etc.)

Be sure to encode spaces in the attribute values as %20. The syntax is as follows:

$ curl -s -k -u '<user>' -X POST -H 'X-USER-IDENTITY-DOMAIN-NAME:<tenant>' -F 'data=@<file>' "https://<url>/serviceapi/logan.uploads?uploadName=<upload name>&entityName=<entity name>&entityType=<entity type>&logSourceName=<log source>"
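If you’re scripting this, a simple substitution handles the space encoding (the value below is just an example, and this handles spaces only, not full URL encoding):

```shell
# Encode spaces as %20 in a parameter value (spaces only; not a full URL-encoder)
log_source="Linux Syslog Logs"
log_source_enc=$(printf '%s' "$log_source" | sed 's/ /%20/g')
echo "$log_source_enc"   # Linux%20Syslog%20Logs
```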

Here’s an example of uploading a database alert log (log.xml) to the tenant mytenant and Oracle Management Cloud instance myomc, using the upload name Test and loading the file against the Database Alert Logs log source for the database instance named abc.

$ curl -s -k -u '' -X POST -H 'X-USER-IDENTITY-DOMAIN-NAME:mytenant' -F 'data=@log.xml' ""

We get prompted to enter the password, and get the following returned (data has been falsified for publishing):

Enter host password for user '':
{
  "uploadId" : 8363494239047176454539,
  "instanceId" : 8387325151653157416698,
  "status" : "inProgress",
  "startedBy" : "",
  "startedOn" : "2020-03-24T18:48:11.763Z",
  "totalChunks" : 0,
  "chunksProcessed" : 0,
  "requestGUID" : "3acff096-f22f-6c4f4-6761-7dfff28f06cb",
  "files" : [ {
    "fileName" : "log.xml",
    "sourceId" : -9114482389049143451,
    "status" : -1,
    "totalChunks" : 2,
    "chunksSuccess" : 0,
    "sourceName" : "Database Alert Logs",
    "entityName" : "abc",
    "entityType" : "Oracle Database Instance",
    "entityId" : "88005C97CF157BAD0C0CA4AS89A7CD2E"
  } ],
  "canonicalLink" : "v1/logdata/uploads/8363494239047176454539/instances/8387325151653157416698"
}
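Note the canonicalLink in the response. Assuming that path can be fetched with a GET against the same service host (an assumption on my part; check the OMC REST API documentation for the exact status endpoint), you could poll the upload status from a script rather than watching the console. The snippet below only constructs and prints the request:

```shell
# Build a status URL from the canonicalLink in the POST response.
# Treating the link as a GET-able path under /serviceapi is an assumption;
# the <url> placeholder and IDs match the sample response above.
LINK="v1/logdata/uploads/8363494239047176454539/instances/8387325151653157416698"
STATUS_URL="https://<url>/serviceapi/$LINK"
echo "GET $STATUS_URL"
# curl -s -k -u "$OMC_USER" -H 'X-USER-IDENTITY-DOMAIN-NAME:mytenant' "$STATUS_URL"
```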

Viewing Uploads

So now that we’ve manually uploaded a log by API, let’s go to the Oracle Management Cloud console to see the status and view our data. From Log Analytics, click on Log Admin, then click on Uploads.

This will display the latest upload jobs as named by the uploadName parameter:

Upload Sets

When you drill down to that upload set, you’ll see the list of files and their status, entity and log source listed.

Files in Upload Set

From the upload listing or within the upload you can select the action menu on the right, and go directly to Log Explorer.

Upload Menu

Finally, we can see our logs in Log Explorer just like any logs collected directly from an agent.

Logs in Log Explorer

As I mentioned at the beginning of this post, this method is great for testing, but unless you script the upload into a larger process, it’s not the best for continuous delivery. You’ll need to make sure the script is scheduled and executes consistently to have the most complete data.
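One common way to avoid those gaps is a cron entry that runs the upload script on a fixed schedule (the script path, log path, and interval below are hypothetical):

```
# Hypothetical crontab entry: run the upload script every 15 minutes,
# appending its output to a log file for troubleshooting
*/15 * * * * /opt/scripts/omc_upload.sh >> /var/log/omc_upload.log 2>&1
```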