Resolving Log Analytics File Permissions

If you’ve just started using the Oracle Management Cloud Log Analytics service, you might have noticed that some files you expect to see are not being loaded into the cloud.  Most likely this is because your agent does not have the right permissions to read the files.

Identifying Permission Warnings

In the Log Analytics dashboard, the bell icon indicates Warnings or notifications for the administrator.


If you click on this icon, you’ll get the breakdown of files that the Cloud Agent either can’t find or can’t read.   The file warnings are broken down by Top Targets and Top Sources.

Log Analytics Warnings

We’re going to take a look at the error that has to do with permissions, as this is most likely what you’ll find when setting up your log sources.  The agent owner needs to have read permission on a file.  For some of the root-level files, such as syslog and messages, you may need to grant permissions explicitly to that agent user.  You can see in the image below that the messages files exist, but I don’t have permissions to read them.


Resolving Permissions Errors

ACLs allow for a much more granular method of granting access privileges than your standard chown/chmod does. If you don’t have the setfacl command available, you may need to install the acl package.  You can do that by issuing the following command as the root user:  yum install acl.    If that is not an option, the standard chmod method of granting read access will work as well.

As a root user use the setfacl command to add read permissions to the agent owner for these files. In my example, the agent owner is “cllamas”.
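As a sketch of the technique (shown here against a temporary file so it can run anywhere; on the real host you’d run it as root against /var/log/messages, substituting your actual agent owner, e.g. cllamas):

```shell
# Placeholders: substitute the agent owner (e.g. cllamas) and the real
# log file (e.g. /var/log/messages) on your host.
AGENT_USER=$(whoami)
LOGFILE=$(mktemp)

# Add a read-only ACL entry for the agent owner, leaving the file's
# regular owner/group permissions untouched.
setfacl -m "u:${AGENT_USER}:r" "$LOGFILE"

# Verify the new user: entry is in place.
getfacl "$LOGFILE"
```

Note that rotated logs are new files and won’t inherit a file-level ACL, so for logs that roll over you may also want a default ACL on the directory or a post-rotate hook.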


Now when I switch back to the agent owner, I can successfully read the file.


Refreshing my dashboard automatically clears the warning message and updates the Top Targets charts.


Since I don’t have Apache HTTP running, I can suppress the other errors.  I can fix this permanently by disassociating the Apache log source under Configuration.



Sending Data to Oracle Management Cloud


Updating and reposting to reflect current state as of today for agent install and tenants.

Lately I’ve been busy helping customers deploy the new Oracle Management Cloud (OMC) services and discover what it can do for them.   One of the most frequent questions I get when talking with new customers about OMC is how do we collect the data and send it to Oracle’s Cloud?  Using Agents installed on-premises or in the Cloud, we can collect various points of data to be used in the many services that make up Oracle Management Cloud.  The goal of this blog is to introduce you to our points of communication and explain where we get data and how it’s used.

It’s important to note that all communication is outbound, from the Agents to Oracle Cloud; there is no inbound communication.  All agent communication is over HTTPS.  If your systems have direct access to the internet, they can communicate directly.  Most enterprise customers will likely configure a proxy or firewall/ACL changes to allow communication over certain servers and ports.

Each tenant has a unique URL that the agents access. Typically it looks like:


For example:

You must be able to reach this URL.  Whether that’s via a proxy server or a firewall rule, the agent doesn’t much care, as long as it can communicate.   To test this, you can use curl to see whether your particular server can reach the tenant.

curl -I --tlsv1.2

The rest of this blog post will discuss the various agent components and how they communicate with each other in a typical Oracle Management Cloud deployment.



As with any systems monitoring tool, there are agents that help with the collection and processing of data.     Each of the agents is discussed below, along with the services it supports.

Oracle Management Cloud architecture


Gateway

The Gateway is an optional component used to buffer and send all data from Cloud Agents, APM Agents and Data Collectors to the Oracle Cloud.   This way, only one server has to have internet access.  Because of the buffering activity, this may require a standalone machine; CPU utilization can reach 20% in large environments, depending on the number of agents and the volume of data being sent.    All communication from the Gateway agent to Oracle Cloud is over HTTPS (port 443) and is outbound only.

Cloud Agent

The Cloud Agent is responsible for collecting log, metric, performance and configuration data to be used in the various services (Log Analytics, IT Analytics, Infrastructure Monitoring, Configuration & Compliance, etc.).  It is also responsible for executing remediation actions or instructions from the Orchestration service.  This agent sits on the target host, whether on-premises or in a cloud.  The agent can communicate with OMC directly, via a proxy, or through the Gateway.   The Cloud Agent communicates with the Cloud over HTTPS (port 443) directly, or with the Gateway agent over the port the Gateway was installed on (the default is 4459).

APM Agent

When using Application Performance Monitoring (APM), you’ll be using the APM agent, which is installed on the server where the application runs. Whether the application is Java, .NET or Node.js, the agent collects and sends performance data about the application to OMC.  This agent can communicate directly, via a proxy, or through the Gateway (over HTTPS on the Gateway port).  As APM also collects End User Metric data from the application user’s browser, that data is sent directly to OMC.

Data Collector

There is also an optional Data Collector component for users who already have Oracle Enterprise Manager (OEM) configured.  The Data Collector extracts target properties, associations, metrics, incidents, version and additional configuration data about the targets in OEM and shares this with OMC.   The Data Collector talks to OMC directly over HTTPS or via the Gateway (over HTTPS on Gateway port).  You can have multiple Data Collectors if you have more than one OEM configured.  This is a great opportunity to consolidate data into one place if your OEM is separated by organization, geography or lifecycle status.


Each agent has a zip file that can be downloaded from a static link or via the OMC Console.   Installation parameters are set in an agent.rsp file and then an install script is called:

$ ./ agent.rsp

As each agent has a different purpose, their install parameters are slightly different.   Details can be found in the installation docs referenced at the end of this blog.
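As a purely illustrative sketch (the parameter names and values below are assumptions for a Cloud Agent, not taken from the docs; check the installation guide for your agent type for the real ones), an agent.rsp might look something like:

```
# Hypothetical agent.rsp -- parameter names are illustrative only
AGENT_BASE_DIRECTORY=/u01/omcagent
AGENT_REGISTRATION_KEY=<key from the OMC console>
OMC_URL=https://mytenant.itom.management.example.com
GATEWAY_HOST=gateway.example.com
GATEWAY_PORT=4459
```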

Maintaining Agents

Since Oracle Management Cloud is an agile product with monthly releases, the agents may be updated monthly as well.  This does not mean you have to upgrade your agents monthly, but upgrading quarterly will keep your agents in top shape.    For Gateway, Cloud and Data Collector agents, when you’re ready to upgrade simply go to the Agents administration page and select Upgrade.  This will instruct your agent to download the latest agent and perform an out-of-place upgrade seamlessly.  If the upgrade fails for some reason, your original agent will remain active.

APM agents must be upgraded manually as they’re tightly integrated with the application.   These should be built into your application update cycles.


As you can see, communication to Oracle Management Cloud offers a couple of options to meet different needs.  All communication goes over HTTPS, either directly to the cloud or through a centralized Gateway.  If you’re looking for additional information, you may find these sources helpful:

An Introduction to Oracle Management Cloud Log Analytics

I know it’s been a while since I’ve blogged about anything, but I have good reason!  I’ve been busy working with a new service offering that Oracle has launched, called Oracle Management Cloud.   I gave a quick glimpse earlier this year of the three services that were launched – Log Analytics, Application Performance Monitoring and IT Analytics.   Since then we’ve also added Infrastructure Monitoring.   Right now, I’m going to talk a little more about Log Analytics and how you can get started using it.

Log Analytics

From application managers to system admins to DBAs, we all have to look at log files. If you have multiple servers, it can take hours to gather all the relevant logs and comb through them, correlating times and searching for error messages.  Log Analytics can make this process simpler and automated.  By loading all related log files to the Oracle Cloud, you can search through logs from multiple targets, and store terabytes of data. Gone are the days of being approached by someone weeks after an issue happened and asked for another log file only to find out that the logs have rolled over already.

The beauty of Log Analytics is that instead of making you create all your own parsers and log sources, Oracle has provided some very common parsers and sources out-of-the-box.  This list continues to grow every month.  You can get started immediately for Oracle Database (including audit, trace, alert, listener and ASM logs), WebLogic, Tomcat, Windows Events, Linux, Solaris, Apache and more.  But what if you have another product, like a Liferay portal supporting your PeopleSoft application, that you want to collect logs for?  Easy.   Log Analytics is very flexible and allows you to create a new parser specific to any kind of log you want to read and pull that data in alongside your other logs for quick analysis and troubleshooting.

Getting to Know Log Analytics

Let’s break down the Log Analytics page and take a look at what we can do. At the top we have the search bar, which allows us to search using keywords and phrases.  It also shows your recent searches when you click in the search bar, so you can select one from your history and rerun it.   To the right, you have options to Save, Open and Configure.  Save allows you to save your search as a custom widget, which can be added to a dashboard; we’ll talk more about that later. Open allows you to open a previously saved search.  Configure takes you to the configuration options for log sources and parsers; we’ll cover this more in another blog. The Run button executes your search.

Log Analytics Search Bar

Also in this section you have the time selector.  This window expands to give you many pre-defined time windows, or allows for a custom time window.

Log Analytics Time Selector

The Data panel on the left has all the properties and fields.


We can filter on any of these fields to find a specific entity type, entity, severity or error ID.  You can also filter on Groups or Systems (created within OMC or imported from Oracle Enterprise Manager). When you click on one of the fields, you’ll be presented with the list of entries, as well as a chart that shows the trend of logs associated with that entry over the time period you have selected.


The Visualize panel allows us to specify which fields we want to see and how we want to view the data.  Pie chart, records with a histogram, heat map, tile, sunburst… an option for any type of query.


Then you have your chart.  By default you’ll see a pie chart of logs by log source over the last 60 minutes.


Putting it to Work

Now that we’ve talked about the layout and features of Log Analytics, let’s take a look at some of the ways we can use our log files.  If we select a certain type of logs, let’s say PeopleSoft Application Tuxedo Access Logs (shown here), and with a couple of clicks drill down, we’re presented with the histogram view of this particular log type, and a time sequence of the entire set of associated log files.  From here I can continue to drill down or filter based on specific fields or time.


One thing that’s important to note in the picture above, is that all timestamps are normalized to UTC.  This can be a great help when you’re consolidating logs from different applications and servers across the globe that have different timezones.  With all log files automatically consolidated and aligned in time-order, I can now see for example, when someone logged in and changed permissions or my transaction rate dropped.

One of my favorite LA features is what I call the needle in the haystack finder, aka clustering.  Let’s say we take a look at our Database Listener logs for the last 7 days.  Looking at the chart below, you can see we have 7m+ entries.


Now that’s a lot of logs to look through!  In fact, I really can’t remember the last time I found anything good in the Listener logs.  Always hated those.  Wouldn’t it be great if we could see what was clogging up the logs, or whether there’s really anything important in there?  A lot of the time you’ll see sporadic connection errors that don’t mean anything.  But how do you know if they’re safe to ignore, or if they’re repeating frequently?

I’m going to use the Cluster command to show off OMC’s powerful machine learning.  I can do that by simply adding |cluster to the search string, or by selecting Cluster from the Visualize dropdown.  Now I see that my 7 million entries comprise just 25 unique message signatures.  This is much easier to digest, and makes it easier to identify problems.  You can see that the top error in this case is TNS-12514, and it happens to coincide (as seen in the Trend chart) with the connections established by the exaboard user.  Now I have something to go on.  In just a few clicks I’ve found a problem with the service_name being used by this user.
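The search itself is just the log source filter piped to the command. As a sketch (the source name here is illustrative; use whatever your listener log source is called in your tenant):

```
'Log Source' = 'Database Listener Alert Logs' | cluster
```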


Again, this can be applied to all types of log files.   Database, Middleware, Host, Application…  just to name a few.  I’ve seen customers identify changes to filesystems, invalid user attempts and more with just a few clicks.

Hopefully you’ve enjoyed this quick introduction to Log Analytics.  I’ll be back with more details on how to use this in everyday tasks, as well as how to create custom log sources and parsers.

To learn more about Oracle Management Cloud go to or contact me here.




Deinstalling Oracle Enterprise Manager 13c in One Step

Not that you’d want to deinstall Oracle Enterprise Manager (OEM) 13c, but I just have to tell you how easy it is now!  This was one of my  pet peeves in OEM 12c.  Deinstalling had a list of about 20 steps, and the order was very particular, as was the alignment of the sun and moon.  Not anymore, in OEM 13c, it’s one command and it’s beautiful.   Really, it’s the simple things that make me happy!

This may not be a common concern for most people, because once they install they just continue to upgrade and move forward.  However when you do a lot of testing (especially destructive testing) and you have to rebuild your environment frequently, this is a big deal.

Want to see how easy it is?  First, be sure to start with your environment up (repository database and OMS).

Copy the deinstall script to a temporary location…

$cp /u01/oracle/m13c/oraclehome/sysman/install/ /home/cllamas

Then run it.  Be sure you have your passwords and Oracle Home handy.

$/u01/oracle/em13/perl/bin/perl /home/cllamas/ -mwHome /u01/oracle/em13 -stageLoc /home/cllamas

That’s it.  All done.  Agent, OMS (including BI Publisher), and the schema from the repository database.   The database is not removed, just cleaned up and ready to start a new install 😉

Again, it’s the simple things that make me happy.   For more information check out the deinstall docs; there are additional ways to deinstall certain components, but when you’re looking for a clean sweep, this is the trick.

Output of deinstall:
Refer to /home/cllamas/deinstall_2016-08-01_08-42-48.log for deinstall log
This is a First OMS install. So, this deinstalls the OMS , Repository and Agent. Confirm (y/n)y
User confirmed for deinstallation.

Enter the SYS Password :
Enter the sysman Password :
Enter the Admin Server password :

The command is /u01/oracle/em13/bin/emctl stop oms

Stopping oms………… Wait for the completion of the execution.
Oracle Enterprise Manager Cloud Control 13c Release 1
Copyright (c) 1996, 2015 Oracle Corporation. All rights reserved.
Stopping Oracle Management Server…
Oracle Management Server Already Stopped
Oracle Management Server is Down
JVMD Engine is Down
return value is : 0

The command is /u01/oracle/em13/bin/omsca delete -full -OMSNAME EMGC_OMS1 -AS_USERNAME weblogic -REP_CONN_STR “(DESCRIPTION=(ADDRESS_LIST=(ADDRESS=(PROTOCOL=TCP)(”

The oms delete will take sometime. Wait for the completion of the execution. Don’t abort the execution.

Oracle Enterprise Manager Cloud Control 13c Release
Copyright (c) 1996, 2015, Oracle. All rights reserved. Do You really want to delete the OMS (Y|N):Deleting BI Publisher Server Named “BIP” Enter Administration Server user password:
Processing command line ….
Repository Creation Utility – Checking Prerequisites
Checking Global Prerequisites
Repository Creation Utility – Checking Prerequisites
Checking Component Prerequisites
Repository Creation Utility – Drop
Repository Drop in progress.
Percent Complete: 22
Percent Complete: 47
Percent Complete: 49
Percent Complete: 100
Repository Creation Utility: Drop – Completion Summary
Database details:
Connected As : sys
Prefix for (prefixable) Schema Owners : SYSMAN
RCU Logfile : /u01/oracle/em13/cfgtoollogs/cfgfw/emsecrepmgr.log
Component schemas dropped:
Component Status Logfile
Oracle Platform Security Services Success /u01/oracle/em13/cfgtoollogs/cfgfw/opss.log

Repository Creation Utility – Drop : Operation Completed
OMS Deleted successfully

return value is : 0
Running the command /u01/oracle/em13/sysman/admin/emdrep/bin/RepManager -action drop -dbUser sys -dbRole sysdba -connectString “(DESCRIPTION=(ADDRESS_LIST=(ADDRESS=(PROTOCOL=TCP)(” -mwHome /u01/oracle/em13 -mwOraHome /u01/oracle/em13 -oracleHome /u01/oracle/em13

Repository drop will take sometime. Wait for the completion of the execution. Don’t abort the execution.

stty: standard input: Inappropriate ioctl for device
stty: standard input: Inappropriate ioctl for device
stty: standard input: Inappropriate ioctl for device
stty: standard input: Inappropriate ioctl for device
processing arguments
compiling arguments for validation
Enter sys user password :
Verify :
Performing PreDropAll action…
Enter password for: sys process_id:1p5c6aeconlq4
Done PreDropAll action…
Dropping BIP schema…
Enter sys user password :
Verify :
Action on BIP schema succeed.
Dropping APM schema… Component is already dropped Enter sysdba password:
Dropping OPSS schema… Component is already dropped Enter sysdba password:
[01-08-2016 08:45:11] executing query : “select count(*) from all_users where username=’SYSMANUPGR_OPSS'”
[01-08-2016 08:45:12] User SYSMANUPGR_OPSS Does Not Exist
Dropping STB schema… Processing command line …. Enter sysdba password:
Repository Creation Utility – Checking Prerequisites
Checking Global Prerequisites
Repository Creation Utility – Checking Prerequisites
Checking Component Prerequisites
Repository Creation Utility – Drop
Repository Drop in progress.
Percent Complete: 9
Percent Complete: 9
Percent Complete: 9
Percent Complete: 46
Percent Complete: 100
Repository Creation Utility: Drop – Completion Summary
Database details:
Connected As : sys
Prefix for (prefixable) Schema Owners : SYSMAN
RCU Logfile : /u01/oracle/em13/cfgtoollogs/cfgfw/emsecrepmgr.log
Component schemas dropped:
Component Status Logfile
Common Infrastructure Services Success /u01/oracle/em13/cfgtoollogs/cfgfw/stb.log

Repository Creation Utility – Drop : Operation Completed
Successfully dropped schema
Dropping MDS schema…
Enter sys user password :
Verify :
Action on MDS schema succeed.
Enter password for: sys process_id:1udv4t5atkf3v
Processing command line ….
Repository Creation Utility – Checking Prerequisites
Checking Global Prerequisites
Repository Creation Utility – Checking Prerequisites
Checking Component Prerequisites
Repository Creation Utility – Drop
Repository Drop in progress.
Repository Creation Utility: Drop – Completion Summary
Database details:
Connected As : sys
RCU Logfile : /u01/oracle/em13/sysman/log/schemamanager/m_080116_0845_AM/m_080116_0845_AM.DROP/rcu.log
Component schemas dropped:
Component Status Logfile
EM Repository Drop Success /u01/oracle/em13/sysman/log/schemamanager/m_080116_0845_AM/m_080116_0845_AM.DROP/em_repos_drop.log

Repository Creation Utility – Drop : Operation Completed
Performing PostDropAll action…
Enter password for: sys process_id:22ol5ee7e7oc
drop completed successfully
return value is : 0
The command executed is /u01/oracle/em13/oui/bin/
Launcher log file is /tmp/OraInstall2016-08-01_09-18-50AM/launcher2016-08-01_09-18-50AM.log.
Starting Oracle Universal Installer

Checking swap space: must be greater than 500 MB. Actual 15455 MB Passed
Checking if this platform requires a 64-bit JVM. Actual 64 Passed (64-bit not required)
‘detachHome’ was successful.
Logs successfully copied to /u01/oraInventory/logs.
return value is : 0

Deleting the instance home
Deleting the em home
The deinstallation of OMS is successful.

The location of the file is : /etc/oragchomelist




Viva Las Vegas – Collaborate 2016

It’s that time of year again, time to spend a few days in Las Vegas at Collaborate with some of the top Oracle experts and exercise your brain (and your feet)! I will be heading out in a few days and have a great schedule lined up!

Collaborate 16

This year will be a little different as I’m representing two products.   For a while now I’ve been working more on Oracle Management Cloud, our new cloud services for Log Analytics, IT Analytics and Application Performance Monitoring.   These services are complementary to your existing Oracle Enterprise Manager solution.  From a DBA perspective, I’m especially excited about IT Analytics, as it’s an answer to many of the questions we get asked about the long-term performance, capacity and resource data that Oracle Enterprise Manager collects.  We will have a demo booth, as well as a session on Monday at 10:30 in Palm B – Oracle Management Cloud: Next-Generation Monitoring, Management and Analytics.   Be sure to attend this session to hear about what Oracle Management Cloud has to offer!   You can also learn more about the Oracle Management Cloud services here.

As always, there’s a lot of excellent Oracle Enterprise Manager sessions this year!  If you’re in Las Vegas on Sunday morning, be sure to register for the Hands-on Lab: Everything I Needed to Know About Enterprise Manager I Learned at COLLABORATE.   We’ll start at 9am with an overview of EM 13c and work on new features in target properties, dynamic groups, creating gold agent images, tablespace corrective actions, and more!  You won’t want to miss out on this one!  Be sure to pre-register as space is limited.

Here’s where you’ll find me:

Sunday – 9-1 – Hands-on Lab: Everything I Needed to Know About Enterprise Manager I Learned at COLLABORATE

Monday – 10:30am Palm B – Oracle Management Cloud: Next-Generation Monitoring, Management and Analytics – Learn about the latest offering in IT Operations Management – Log Analytics, IT Analytics and Application Performance Monitoring.

Tuesday – 4:45pm Palm B – Building a Highly Available Enterprise Manager System with Werner de Gruyter – This session is a must-attend for anybody who needs to build or maintain a highly available Enterprise Manager.

Wednesday – 8:00am Palm B – Oracle Enterprise Manager Security: A Practitioner’s Guide – Rise and shine with my session on how to make Enterprise Manager security work for your company in more ways than one!

For a full list of Oracle Management Cloud and Oracle Enterprise Manager demos, labs, sessions and SIG meetings be sure to save or print this handy schedule!   Don’t worry, if you’re not heading to Las Vegas, you can still catch my session by registering for the IOUG Virtual Forum!

I will also be on the Oracle demo grounds either at the Oracle Enterprise Manager booth or Oracle Management Cloud booth.   If you follow me on twitter or read my blog and we haven’t met in person, stop by and say hi!

Engineering the Future of STEM with Fun

You don’t have to be a technologist to know what STEM is.  These days, you see STEM everywhere you turn.  Science, Technology, Engineering and Math.  Some even add Art and make it STEAM.  Whatever you call it, it’s critical for our kids to advance in this technology-focused world.  It doesn’t have to be all work though.  There are some great resources that make learning STEM more fun than you can imagine.

Our school is blessed with two amazing teachers, Noreen and Shelly, who are dedicated and excited about teaching young kids technology!   Once a week, they spend a few extra hours helping the kids in our Jr Robotics Club.  The club is open to kids 1st-3rd grade, boys and girls.  Yesterday, I took an hour out of my day to spend some time assisting.  It was so exciting to see how many young kids were excited and engaged with the robots they played with.   Especially as a woman in IT, it was exciting to see how many girls were involved as well!

Initially, they focused on the early LEGO robotics as they were gearing up to participate in the FIRST Lego League Jr showcase in December.  Our school had 4 teams create a “working LEGO project”.  They had to create and build something that moved, using the LEGO motors, that related to this year’s theme of waste and recycling.  My 7-year-old son’s team, Tough LEGO Town, built a can-recycling conveyor belt.   They take their creation, along with a poster that shows their ideas, to a competition where they are interviewed by volunteer judges.  As a parent it was thrilling to see them explain their creation and demonstrate it at such a young age.  Another critical skill they’re practicing is working in groups; that’s something many adults still don’t do well!  Certainly nothing like what I was doing in 1st grade!


Since the competition was over, and the kids still wanted to learn about robots and technology, the teachers have continued meeting once a week after school.  The group grew from 24 to about 40 once the word got out. They break the kids into groups and have different stations for each group to work on.  Here’s an example of what they’re using in our club to engage and excite:

Scratch and Scratch Jr

Using iPads or Laptops, the kids can code using Scratch.  If you haven’t checked out Scratch, you should!   Scratch was developed at MIT and is a great way to get kids using their creativity and thinking skills.  They can create stories, animations, games, music and art, just the way they want it.  You can start using Scratch on a PC or Mac at or you can download the Scratch Jr app on a tablet.   This is a great one to do at home too!  My kids love to share their creations with their friends!

LEGO WeDo 2.0

The LEGO WeDo is the beginner version of their Mindstorm robotics.  The build instructions are easy, and the software you use to program is very simple!   The newest release just came out in January (too late for Christmas unfortunately).  WeDo 2.0 is geared towards 2nd through 4th grade, though with parent help Kinder and 1st will also enjoy!  The core kit comes with 280 bricks, a SmartHub power block, 2 motors, a motion sensor, storage tray, programming software and examples!  You can see more in the video below.   WeDo 2.0 is available for $159.95 from LEGO Education here, this is definitely on my Christmas list.   We’re currently using the original WeDo sets, but the 2.0 looks like so much more fun!

If you have older kids, 4th and up, you might also want to look at the LEGO Mindstorm EV3.   My 4th grader competes in the regional FIRST Lego League competition for Houston this weekend, and they use the EV3 robot.  You can read more about their experience with FLL here.

Dash & Dot

Wonder Workshop’s Dash & Dot are controlled by an app on your phone or tablet.  You can program them using a visual block based app, or you can play one of the games that are available.  Along with movements, sounds and lights, you can purchase accessories such as a Xylophone that Dash can play.


Sphero

Sphero is an app-controlled robot that allows you to program with visual blocks, then see the actual C-based code that you wrote.  Our school has the Sphero SPRK Edition, which is fun since you can see what’s inside and how it works.

Family STEM Time

While all of these robots and programs are very useful in schools to aid in their STEM curriculum, we shouldn’t just limit technology teaching to school.   These robots and games make learning fun, and it’s just as easy to have fun at home with mom and dad!   They make excellent birthday and Christmas gifts for the kids who have everything!  Grandparents love to buy educational items, instead of just video games.  Take some time, invest in the future of tech, and watch your child’s face light up with joy!


EM 13c – Enhancements to DB Optimizer Statistics Console

Optimizer Statistics Console

There are a lot of new features in Enterprise Manager 13c (EM 13c) that DBAs will love.   One of the things I’d heard customers complain about was the lack of visibility into the statistics jobs that run in the target database using DBMS_SCHEDULER. The enhanced DB Optimizer Statistics Console provides a central place to manage database statistics, view a summary of all object statuses, and see the status and performance of the statistics jobs that have run.   To access the console, from a database target select Performance / SQL / Optimizer Statistics.

The Operations, Configure and Status sections haven’t changed much.  In the Operations section you can Gather, Lock, Unlock, View, Restore and Delete statistics.  From the Configure section you can easily adjust Global Statistics Gathering Options, Object Level options, make changes to the Auto Task, and view SPA Validation Results.

The notable new features here are the breakdown of Statistics Gathering tasks and the Jobs List.  From the chart, you can quickly see if you have job failures that might be affecting your performance.  In the status chart, you can identify whether you have Stale statistics that need attention.


Drilling down into the Statistics Gathering Job List or Auto Tasks will bring up a detailed report of the job run.


Not a huge change, but hopefully one that will make the DBA’s life easier and provide better insight when evaluating database statistics!


EM 13c – Exciting Updates to Target Properties

It’s here, it’s finally here!  I know most of you have already downloaded the binaries and started installing or upgrading your test environment.   It’s just too tempting not to, right?  One question I’ve heard over and over since Oracle Enterprise Manager 12c came out… Can I use User Defined Target Properties in my Dynamic Group and Administration Group?  Sadly the answer has always been no.   Until now.  Now, the answer is proudly YES!

User Defined Target Properties

One of the small but powerful new features in EM 13c is the ability to use your custom target properties to define Dynamic and Administration groups!  This works with global target properties, the ones you set with target_type="*".  The target-specific properties won’t show up in the select list.  A small compromise, I think!

First, create your custom target property with emcli command.

$ ./emcli add_target_property -target_type="*" -property="Owner"
Property "Owner" added successfully

Next, create a Dynamic Group and select the Define Membership Criteria button.


You’ll see a list of the default target properties. Click the Add/Remove Target Properties button.

target properties

In this list, you will now see the Owner target property that I created earlier.  Select the box and click Select
target properties

Now, you need to set which values of this property you want to be added to this group by clicking the magnifying glass next to Owner.


Since this is Jill’s group, we’re going to select Jill, click Move and then Select.

target properties

Now we see that this group is going to contain any targets owned by Jill.


Final step is to review membership and click OK.

target properties

Now that the group has been created, if Jill owns any targets, we'll see them listed in her group.

target properties

You will also see the global target property in the selection for the Administration Groups as shown here:

Administration Group with User Defined Target Property


Target Property List of Values

Another big enhancement is the ability to create a list of valid values to more accurately store your target properties.  Say your Line of Business property should only contain DBA, MW, and App.  However, admins keep entering the wrong values, and targets with unexpected values won't be picked up by your Dynamic or Administration groups.

To enable a Target Property to use a Master List of Values:

$ ./emcli use_target_properties_master_list -property_name=orcl_gtp_location -enable

Targets exists with values set for this property. Run the same command with -copy_from_targets flag to copy all values to the master list.

If your targets are already using this property, you’ll get the error message above.  Update your emcli command to include the -copy_from_targets flag.
$ ./emcli use_target_properties_master_list -property_name=orcl_gtp_location -enable -copy_from_targets
Successfully migrated property values

To see the target properties, on any target go to the target menu, then Target Setup / Properties.  Click Edit to update properties.

target properties

As you can see, there are no values listed for the Location target property.

target properties

$ ./emcli add_to_target_properties_master_list -property_name="orcl_gtp_location" -property_value="Houston" -property_value="Austin"
Successfully set 2 value(s) for property: orcl_gtp_location

Now under the edit Target Properties you’ll find the correct values listed:

target properties

If you added the wrong value, or you need to remove a value, you use the delete_from_target_properties_master_list command:

$ ./emcli delete_from_target_properties_master_list -property_name="orcl_gtp_location" -property_value="Houston"
Successfully deleted property-value

To see the valid values, you can use the list_target_properties_master_list_values command.

$ ./emcli list_target_properties_master_list_values -property_name=orcl_gtp_location
Target Properties Master list of values for property : orcl_gtp_location
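Putting the master-list steps together, here is a dry-run sketch of the whole workflow (my own consolidation, not a script shipped with EM): the commands are echoed rather than executed so you can review them first, and the ./emcli path is an assumption about your install.

```shell
EMCLI=./emcli            # assumed path to your emcli installation
PROP=orcl_gtp_location   # the global Location property from the examples above

# 1. Enable the master list, migrating any values targets already use
echo "$EMCLI use_target_properties_master_list -property_name=$PROP -enable -copy_from_targets"
# 2. Add the allowed values
echo "$EMCLI add_to_target_properties_master_list -property_name=$PROP -property_value=\"Houston\" -property_value=\"Austin\""
# 3. Verify the list of valid values
echo "$EMCLI list_target_properties_master_list_values -property_name=$PROP"
```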


For more on what you can do with Target Properties, you can see my previous post here.   I think with these two enhancements to target properties, EM administrators everywhere will smile a little brighter tonight.  Enjoy!


Enterprise Manager 13c – What’s New and What You Should Know!


You may have heard by now that Enterprise Manager (EM) 13c has been released, and you can't wait to get your hands on the new updates and see what it's all about!  Here are a few things you should know!

What’s With the New UI?

Just when you got used to the changes in EM 12c, they go and change it all up again, right?  Well, change can be hard, but in this case, I think you'll appreciate a few things.  The overall theme has been updated to match that of other Oracle products.  The focus is on presentation of the data: more charts, callouts of incident and error counts, and tiles that help you get to important information quickly.

Enterprise Manager 13c

Where did my menus go?

Instead of a menu on the left for general use and admin features on the right, the menu bar has been consolidated to the right side of the page.  Even I had a hard time remembering whether an item was on the left menu or the right.  It will take a little time to get used to, but I think it's going to be a lot easier to navigate, more consistent, and much more user friendly.


From left to right:

  • Enterprise – Monitoring, Jobs, Reports, Patching, etc.
  • Target – All Targets, Databases, Exadata, etc.
  • Favorites – Save your favorite targets for quick access
  • History – View your last 10 pages
  • Setup – Security, incident rules, add targets, etc.

Next you'll see the Search icon; when you click on the magnifying glass, a search box will appear.  This can be used to search for any target.


The Notification icon is new; it will take you to the Notification Center, one of the new features we'll discuss more later.

Finally, you have the User menu, where you'll find user preferences and logout.

What versions of WebLogic and Java are installed?

The infrastructure stack includes an updated WebLogic release, and Java is version 1.7.0.  Both are deployed during the Enterprise Manager install.

What’s the upgrade path?

Direct upgrade from supported Enterprise Manager 12c releases will be supported.  The repository database needs to be upgraded to a supported release first.

If you're still on EM 10g or 11g (we need to talk), you need to upgrade to EM 12c first before you can upgrade to 13c.

Is there a 2-system upgrade?

No, unfortunately there is no 2-system upgrade, so a full downtime will be required.

Can I use my 12c Agents?

Yes, if your agents are on a supported 12c release, they are compatible with 13.1.  However, you should plan to upgrade the agents as soon as possible to take advantage of new features.

A Few New Features

Always On Monitoring Service – A separate service used during planned downtime; it will receive target availability and send limited notifications.

Agent Gold Images – Create a standard agent and mass deploy updates with ease.  This will cover provisioning, upgrading and updating agents.    I will be posting about this separately, in detail.

Corrective Actions – Support for corrective actions on all event types (beyond just metrics), and an out-of-the-box customizable Tablespace corrective action.

Target Properties Master List of Values – Define a list of values for your target properties.

Incident Manager – New dashboards and the ability to export incident rules. Enough said.

That's a quick summary, but I'll be blogging about more features in detail.  In the meantime, take some time to review the New Features documentation.




Kids and LEGO and Robotics, Oh My!


If you know me well, you know Oracle is my secondary passion.  My true passion is my children.  I have two amazing, beautiful, intelligent and active boys, ages 7 and 9 (no bias there, I know).  Life is never dull, that's for sure!  We are fortunate to live in an area with great schools, with teachers and administration that realize the importance of Science, Technology, Engineering and Math (STEM) education.  We have a very active PTO that goes to amazing efforts to raise money so that the school can get the latest technology and programs to help our kids succeed.  The whole school participates in the Hour of Code each year, and recently started a Python programming club.  Kids actually come to school early to program!  They also recently added robotics teams for 1st-6th grade, and this year my 4th and 1st grader are both participating.

I didn't know much about the FIRST organization before now.  My boys have done some Jr. Engineering and Jr. Robotics camps, most recently an excellent one by Woodlands Robotics that they really enjoyed.  This led to their interest in participating with the school teams.  My oldest is in FIRST LEGO League (FLL), while my youngest is doing FIRST LEGO League Jr. for kids in K-3rd grade.  Last weekend the FLL team had their first competition, and there's really only one way to describe it.  Electric.


This program is for students in 4-8th grade.  Each year there’s a new challenge.   This year it’s about trash, and how we can reduce, reuse and recycle.   The teams have missions they have to solve with their robot, all worth a certain number of points. They need to work together to decide the best use of their time, and how to program the robot to complete as many missions as possible in 2 1/2 minutes.

They are also judged on their core values (discovery, teamwork, gracious professionalism), robot design, and their project.  The project is a way for them to show their creativity in how to solve this year's challenge.  Our team decided that our school could reduce trash by using sporks instead of spoons and forks, and by recycling in the cafeteria.  They also wrote a rap and performed it for the judges.  The team has a sponsor, but all the work is done by the kids themselves.

For the robot, they use the LEGO Mindstorms EV3.  If you haven't checked this out, you should.  Every techie could fall in love with this.  You can build various robot models, with sensors and arms.  Then you connect it to your computer and program it using a visual block structure.  Voila! You've got a robot that can play music, move pieces, turn wheels and do anything else you can think of!  The programming is very flexible, and easy enough once you understand what the components do.

Competition Day

Our team is called 4G Short Circuits.  These 6 boys and 1 girl have been meeting 4 hours a week after school since early October.  They stepped it up in the last 2-3 weeks, adding 45 minutes before school every day, working through lunch, and extra after-school hours.  For a while I was a little concerned with how much time my son was having to spend on this; he is only 9, after all.  With 7 incredibly intelligent kids, one of the things they're learning is to work together as a team and communicate effectively.  When everybody has a great idea, it's sometimes hard to get them all out for discussion.  Once they work out the kinks here, I think we can expect to see amazing things!  I know adults who have a hard time with this concept, so it's exciting to see the kids working on it so young.

We arrived at the competition before 8am on a Saturday.  The kids got in a few practice rounds, some hands-on experience and advice from the judges, and had a chance to fine-tune their programs a bit.  It was amazing to see them just go to work and know exactly what they needed to do.  Then the kids went in front of a panel of judges for design, project and core values.

Solving problems after first test run.

Finally, the robot games started.  There are three rounds, and they keep their highest score.  In their first round, things didn't go so well.  The robot went crooked, missed its mark, and stalled.  You could see the disappointment on their faces.  However, when the scores were posted, they were still positive and went straight to work on fixing the problems they had.  They went into the second round knowing that their final program wasn't quite right, but they'd made some changes and hoped for the best.  When they hit 4 out of 5 missions, you could see the pure joy and accomplishment in their faces, and when they were ranked in the top 10, they were beside themselves!  The third round didn't go as well, and they slipped to 11th place, but these kids remained positive and upbeat the whole time, and that is what I'm so proud of.  Their first time as a team, their first robotics competition, and they came in 11th out of 42 teams!

Testing out their robot before competition starts.

They worked so hard to get to this point, just like any sports team playing a tournament.  Although they didn't win an award, they did earn their golden ticket to the regional competition in February.  These 4th graders worked together, worked hard, challenged themselves in areas they weren't familiar with, and kept pushing until the very end.  They never gave up, never gave in.  They were electric.  And as a parent in tech, it was so exciting to see the sparks!  They've got a break for now, but in January they'll be back in the lab working together to fix those problems and see how many more points they can earn for the next competition!

4G Short Circuits celebrating their golden ticket

One thing about the competition that stood out was how diverse the participants were.  All ages, races and genders; it was really fun to see so many young girls involved with the robotics and programming.  The idea behind FIRST was to get more kids interested in STEM by leveraging the sports competition model.  I think it's working!  I think back to what I was doing in 4th grade, and it sure doesn't compare to this!

Texas Torque is the high school team that sponsors the FLL competition for our district.  I have to say, this was the nicest group of high school kids I've ever met.  They won the FIRST Robotics Competition World Championship in 2013 and were featured in the documentary Roboleague.  It's an amazing look at how the FIRST league got started, and how bright and amazing these kids are.  So, if you have bright, creative, talented children who like to use their minds, check out FIRST and see if there's an event in your area, or consider forming a team!

See my Kids & Code page for more resources!