Tuesday, August 22, 2023

Schema level privilege grants

Many modern applications separate the data-owning schema from the application service (run-time) account used to access the data. This provides separation of duties, supports the least-privilege model, and limits the damage if the accessing account is compromised. But it raises one difficulty: how do we manage the list of privileges for that run-time account? And as the application schema changes over time, how do we keep that list current? Prior to Oracle 23c, developers had only two choices:
 
  • Grant individual privilege on each table and view in the application schema.
  • Grant “ANY” privileges.
 
The first choice is inconvenient, because we need to identify every single table and view and issue an individual grant to every run-time or service user. We could develop a script, but it is still cumbersome. It also handles application schema changes poorly: whenever a new table or view is added, we must remember to make the corresponding privilege grants.
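In practice, that per-object approach usually meant generating the grants with a script along these lines (a sketch only, assuming a DBA session and the APP_SCHEMA / APP_USER accounts used later in this post):

```sql
-- Generate one grant per table owned by the application schema.
-- Note: new tables added later are NOT covered; the script must be re-run.
begin
  for t in (select table_name
            from   dba_tables
            where  owner = 'APP_SCHEMA')
  loop
    execute immediate
      'grant select on app_schema.' || t.table_name || ' to app_user';
  end loop;
end;
/
```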
 
The second choice of granting an "ANY" privilege, while convenient, is sub-optimal from a security angle, because that grant gives the user the ability to select from any table in the database. If that user account is compromised, your entire database is exposed.
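For contrast, the pre-23c ANY grant is a single statement, but its scope is the whole database rather than one schema:

```sql
-- Database-wide: APP_USER can now select from tables in ANY schema,
-- not just the application schema.
grant select any table to app_user;
```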
 
To address these problems, Oracle 23c introduced schema-level grants. If you say GRANT SELECT ANY TABLE ON SCHEMA SCOTT TO DEMO, the user DEMO can see all tables and views in the SCOTT schema, and if a new table is created in SCOTT, DEMO instantly has access to it. No additional management is needed, and we can continue to support the least-privilege security model with appropriate separation of duties.
 
First, let's create the users and a few tables to test with.
 
demo@FREEPDB1> grant create session,create table, unlimited tablespace
  2  to app_schema
  3  identified by "app_schema";
 
Grant succeeded.
 
demo@FREEPDB1> grant create session
  2  to app_user
  3  identified by "app_user";
 
Grant succeeded.
 
demo@FREEPDB1> conn app_schema/app_schema@freepdb1
Connected.
app_schema@FREEPDB1> create table t1( name varchar2(20) );
 
Table created.
 
app_schema@FREEPDB1> insert  into t1 values ('Tom'),('Kyte');
 
2 rows created.
 
app_schema@FREEPDB1>
app_schema@FREEPDB1> create table t2( city varchar2(20) );
 
Table created.
 
app_schema@FREEPDB1> insert  into t2 values ('Minnesota'),('Chicago');
 
2 rows created.
 
app_schema@FREEPDB1> commit;
 
Commit complete.
 
 
Now connect as APP_USER and verify whether you can see the tables owned by APP_SCHEMA.
 
app_user@FREEPDB1> select * from session_schema_privs;
 
no rows selected
 
app_user@FREEPDB1> select * from app_schema.t1;
select * from app_schema.t1
                         *
ERROR at line 1:
ORA-00942: table or view does not exist
 
 
app_user@FREEPDB1> select * from app_schema.t2;
select * from app_schema.t2
                         *
ERROR at line 1:
ORA-00942: table or view does not exist
 
APP_USER could not select from the APP_SCHEMA tables, because the user has no privileges on those objects or the schema. Now let's grant a schema-level privilege to APP_USER, so that APP_USER gets the permissions required to view the data in the APP_SCHEMA tables.
 
app_user@FREEPDB1> conn app_schema/app_schema@freepdb1
Connected.
app_schema@FREEPDB1> grant select any table on schema app_schema to app_user;
 
Grant succeeded.
 
After the schema-level grant, APP_USER can see all the data in APP_SCHEMA.
 
app_schema@FREEPDB1> conn app_user/app_user@freepdb1
Connected.
app_user@FREEPDB1> select * from app_schema.t1;
 
NAME
------------------------------
Tom
Kyte
 
app_user@FREEPDB1> select * from app_schema.t2;
 
CITY
--------------------
Minnesota
Chicago
 
app_user@FREEPDB1> select * from session_schema_privs;
 
PRIVILEGE                                SCHEMA
---------------------------------------- ---------------
SELECT ANY TABLE                         APP_SCHEMA
 
Here comes the good part: when APP_SCHEMA adds a new table, APP_USER automatically has access to it.
 
app_user@FREEPDB1> conn app_schema/app_schema@freepdb1
Connected.
app_schema@FREEPDB1> create table t3( country varchar2(20) );
 
Table created.
 
app_schema@FREEPDB1> insert  into t3 values ('USA'),('UK');
 
2 rows created.
 
app_schema@FREEPDB1> commit;
 
Commit complete.
 
app_schema@FREEPDB1> conn app_user/app_user@freepdb1
Connected.
app_user@FREEPDB1> select * from app_schema.t3;
 
COUNTRY
--------------------
USA
UK
 
app_user@FREEPDB1>
 
As you have seen, the new schema-level privileges make it easy to grant access to ALL of an application's data and objects, and as new objects are added to the schema there is no need to update the grants to cover them.
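Based on my reading of the 23c syntax, the schema-level grant can be revoked (and other privileges granted) using mirror forms of the statements shown above; treat the exact privilege names below as something to verify against the documentation for your release:

```sql
-- Remove the schema-wide read access granted earlier
revoke select any table on schema app_schema from app_user;

-- Schema-level grants are not limited to SELECT; for a writer
-- account something like the following should work as well
grant insert any table on schema app_schema to app_user;
grant update any table on schema app_schema to app_user;
```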
 
 
 

Tuesday, August 15, 2023

SYSDATE vs CURRENT_DATE in Autonomous Database

 
SYSDATE is a SQL function that returns the current date and time of the database server's operating system, whereas CURRENT_DATE returns the current date and time in the session time zone. The same distinction applies to SYSTIMESTAMP and CURRENT_TIMESTAMP.
 
When the database and application server are in the same time zone, both functions return the same value. But what happens if we migrate the database to a cloud platform whose servers sit in a different time zone, say an Autonomous Database with its default time zone of UTC?
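A quick way to see the two zones involved is to compare the database and session time zones (both are standard Oracle SQL functions):

```sql
-- DBTIMEZONE: the database time zone
-- SESSIONTIMEZONE: the time zone of the current client session
select dbtimezone, sessiontimezone from dual;
```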
 
Now, if the application uses SYSDATE or SYSTIMESTAMP but expects to get local time, we are in trouble.
 
If you expect the database client's local time, always use CURRENT_DATE and CURRENT_TIMESTAMP. This makes your application independent of the time zone in which the database server runs. For existing applications, however, this means changing the application code.
 
However, Oracle Autonomous Database provides a workaround that syncs the session time zone with the database time zone: by setting the session parameter TIME_ZONE to DBTIMEZONE, CURRENT_DATE and CURRENT_TIMESTAMP return values in the database time zone.
 
 
demo@ATP19C> select sysdate, current_date from dual;
 
SYSDATE                 CURRENT_DATE
----------------------- -----------------------
27-JUL-2023 03:43:41 pm 27-JUL-2023 09:13:41 pm
 
demo@ATP19C> alter session set time_zone = dbtimezone;
 
Session altered.
 
demo@ATP19C> select sysdate, current_date from dual;
 
SYSDATE                 CURRENT_DATE
----------------------- -----------------------
27-JUL-2023 03:43:51 pm 27-JUL-2023 03:43:51 pm
 
demo@ATP19C>
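If changing application code is not an option, one way to apply this workaround globally is an AFTER LOGON trigger. A sketch (the trigger name is mine, and creating a database-level trigger requires the ADMINISTER DATABASE TRIGGER privilege):

```sql
create or replace trigger sync_session_tz
after logon on database
begin
  -- make CURRENT_DATE / CURRENT_TIMESTAMP line up with SYSDATE / SYSTIMESTAMP
  execute immediate 'alter session set time_zone = dbtimezone';
end;
/
```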
 
 
 

Tuesday, August 8, 2023

JSON Schema - Part IV

Having discussed JSON Schema at length in the previous posts in this series (Part I, Part II, Part III), let's move on to another common use of JSON Schema: describing an object. Here is my EMP table, which has only conventional datatypes and no JSON columns.
 
demo@FREEPDB1> desc emp
 Name          Null?    Type
 ------------- -------- -------------
 EMPNO         NOT NULL NUMBER(4)
 ENAME                  VARCHAR2(10)
 JOB                    VARCHAR2(9)
 MGR                    NUMBER(4)
 HIREDATE               DATE
 SAL                    NUMBER(7,2)
 COMM                   NUMBER(7,2)
 DEPTNO        NOT NULL NUMBER(2)
 
To get the JSON schema for the above table, all we need to do is use the DESCRIBE function in the DBMS_JSON_SCHEMA package.
 
demo@FREEPDB1> select dbms_json_schema.describe('EMP') x;
 
X
--------------------------------------------------------------------------------
{
  "title" : "EMP",
  "dbObject" : "DEMO.EMP",
  "type" : "object",
  "dbObjectType" : "table",
  "properties" :
  {
    "EMPNO" :
    {
      "extendedType" : "integer",
      "sqlPrecision" : 4
    },
    "ENAME" :
    {
      "extendedType" :
      [
        "null",
        "string"
      ],
      "maxLength" : 10
    },
    "JOB" :
    {
      "extendedType" :
      [
        "null",
        "string"
      ],
      "maxLength" : 9
    },
    "MGR" :
    {
      "extendedType" :
      [
        "null",
        "integer"
      ],
      "sqlPrecision" : 4
    },
    "HIREDATE" :
    {
      "extendedType" :
      [
        "null",
        "date"
      ]
    },
    "SAL" :
    {
      "extendedType" :
      [
        "null",
        "number"
      ],
      "sqlPrecision" : 7,
      "sqlScale" : 2
    },
    "COMM" :
    {
      "extendedType" :
      [
        "null",
        "number"
      ],
      "sqlPrecision" : 7,
      "sqlScale" : 2
    },
    "DEPTNO" :
    {
      "extendedType" : "integer",
      "sqlPrecision" : 2
    }
  },
  "required" :
  [
    "EMPNO",
    "DEPTNO"
  ],
  "dbPrimaryKey" :
  [
    "EMPNO"
  ],
  "dbForeignKey" :
  [
    {
      "DEPTNO" :
      {
        "dbObject" : "DEMO.DEPT",
        "dbColumn" : "DEPTNO"
      }
    }
  ]
}
 
 
The above JSON Schema shows that the object EMP is of database object type TABLE and belongs to the schema DEMO. From a JSON Schema perspective it is an object; from a relational perspective it is a table. The dbObjectType varies with the object being described: it can be a type, table, view, duality view, and so on.
 
The properties in the JSON Schema map to the columns of the table we described. EMPNO is an integer, whereas ENAME is a string that can also be null. All the mandatory columns of the table (those with a NOT NULL constraint) are listed under the required property of the JSON Schema, and the key columns of the table are mapped to its dbPrimaryKey and dbForeignKey properties.
 
Where does this JSON Schema really help us? It lets us validate incoming JSON documents (both structure and data typing) before persistence, and when we generate JSON from the EMP table it gives us an API-style description of what the data looks like.
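For example, the description itself can serve as the validation schema. The sketch below reuses the DBMS_JSON_SCHEMA.IS_VALID procedure demonstrated in Part III of this series; the candidate document and its outcome are illustrative:

```sql
set serveroutput on
declare
  l_schema json;
  l_data   json;
  l_valid  boolean;
  l_errors json;
begin
  -- the table description doubles as a JSON schema
  l_schema := dbms_json_schema.describe('EMP');
  -- a candidate document that omits the required DEPTNO key
  l_data   := json('{"EMPNO":7999,"ENAME":"ADAMS"}');
  dbms_json_schema.is_valid( l_data, l_schema, l_valid, l_errors );
  if l_valid then
    dbms_output.put_line('document matches the EMP description');
  else
    dbms_output.put_line( json_serialize( l_errors pretty ) );
  end if;
end;
/
```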
  
Similarly, we can get a JSON Schema representation of a PL/SQL object type.
 
demo@FREEPDB1> create or replace type address_type
  2  as object( street varchar2(30),
  3     city varchar2(20),
  4     state varchar2(2),
  5     postal_code varchar2(6) );
  6  /
 
Type created.
 
The JSON Schema for the PL/SQL object type can be obtained the same way.
 
demo@FREEPDB1> select json_serialize( dbms_json_schema.describe('ADDRESS_TYPE') pretty ) ;
 
JSON_SERIALIZE(DBMS_JSON_SCHEMA.DESCRIBE('ADDRESS_TYPE')PRETTY)
-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
{
  "title" : "ADDRESS_TYPE",
  "dbObject" : "DEMO.ADDRESS_TYPE",
  "type" : "object",
  "dbObjectType" : "type",
  "properties" :
  {
    "STREET" :
    {
      "extendedType" :
      [
        "null",
        "string"
      ],
      "maxLength" : 30
    },
    "CITY" :
    {
      "extendedType" :
      [
        "null",
        "string"
      ],
      "maxLength" : 20
    },
    "STATE" :
    {
      "extendedType" :
      [
        "null",
        "string"
      ],
      "maxLength" : 2
    },
    "POSTAL_CODE" :
    {
      "extendedType" :
      [
        "null",
        "string"
      ],
      "maxLength" : 6
    }
  }
}
 
 
The attributes of the object type map to properties of the JSON Schema document. So we can easily describe types, tables, views, and duality views to get their JSON Schema representation.
 
 

Tuesday, August 1, 2023

JSON Schema - Part III

This is a continuation of the previous blog posts, where we discussed JSON Schema in two parts (Part I and Part II).
 
Let's look a little more closely at domains, starting with a shorter one.
 
demo@FREEPDB1> create domain if not exists employeeD as json validate '{
  2     "type":"object",
  3     "properties":{
  4             "name":{"type":"string","minLength":2} ,
  5             "salary":{"type":"number","minimum":800} ,
  6             "email":{"type":"string"} ,
  7             "joinDate":{"extendedType":"date"} } ,
  8     "required":["name","salary","joinDate"]
  9  }' ;
 
Operation 281 succeeded.
 
This schema tells us that any document validated against the employeeD domain must be an object with the following fields: "name", a string of at least two characters; "salary", a number that cannot be less than 800; an optional "email" string; and "joinDate", the date the employee was hired. The "joinDate" field has an extendedType of "date", which maps to the SQL datatype, since JSON has no native "date" type. Finally, for a JSON document to be persisted successfully it must contain the keys "name", "salary", and "joinDate", whereas "email" is optional.
 
Then we associate the above domain to a table.
 
demo@FREEPDB1> create table if not exists
  2  employees( id number, data json domain employeeD );
 
Table created.
 
When we try to insert a document, it fails:
 
demo@FREEPDB1> insert into employees( id,data )
  2  values(1, json('{"name":"john",
  3     "salary":2000,
  4     "joinDate":"2023-05-11T17:04:54+0000"}') );
insert into employees( id,data )
            *
ERROR at line 1:
ORA-40875: JSON schema validation error
 
How do we find out why this data is invalid? This is where DBMS_JSON_SCHEMA comes in handy. It analyses the given JSON schema and JSON data, tells us whether the document is valid, and, if it is not, lists the errors found in the document.
 
demo@FREEPDB1> declare
  2     l_data json ;
  3     l_schema json;
  4     l_result boolean;
  5     l_error json;
  6  begin
  7     l_data := json('{"name":"john",
  8     "salary":2000,
  9     "joinDate":"2023-05-11T17:04:54+0000"}') ;
 10
 11     l_schema := json('{
 12     "type":"object",
 13     "properties":{
 14             "name":{"type":"string","minLength":2} ,
 15             "salary":{"type":"number","minimum":800} ,
 16             "email":{"type":"string"} ,
 17             "joinDate":{"extendedType":"date"} } ,
 18     "required":["name","salary","joinDate"]
 19     }') ;
 20
 21     dbms_json_schema.is_valid( l_data, l_schema, l_result, l_error );
 22     if l_result then
 23             dbms_output.put_line(' Valid data' );
 24     else
 25             dbms_output.put_line( json_serialize( l_error returning clob pretty) );
 26     end if;
 27  end;
 28  /
{
  "valid" : false,
  "errors" :
  [
    {
      "schemaPath" : "$",
      "instancePath" : "$",
      "code" : "JZN-00501",
      "error" : "JSON schema validation failed"
    },
 
    {
      "schemaPath" : "$.properties",
      "instancePath" : "$",
      "code" : "JZN-00514",
      "error" : "invalid properties: 'joinDate'"
    },
    {
      "schemaPath" : "$.properties.joinDate.extendedType",
      "instancePath" : "$.joinDate",
      "code" : "JZN-00503",
      "error" : "invalid type found, actual: string, expected: date"
    }
  ]
}
 
PL/SQL procedure successfully completed.
 
To resolve the error highlighted above: how do we express a date in a JSON document when JSON has no DATE datatype? One way is to use the JSON constructor in "extended" mode, which looks for annotations such as $oracleDate and treats the value (2023-05-11T17:04:54+0000) not as a string but as the annotated type (in this case, a date).
 
demo@FREEPDB1> insert into employees( id,data )
  2  values(1, json('{"name":"john",
  3     "salary":2000,
  4     "joinDate":{"$oracleDate":"2023-05-11T17:04:54+0000"} }' extended) );
 
1 row created.
 
Now we can verify the datatype of the data using the type() item method.
 
demo@FREEPDB1> select e.data.joinDate jd,
  2     e.data.joinDate.date(),
  3     e.data.joinDate.type() ty
  4  from employees e;
 
JD                             E.DATA.JOIN TY
------------------------------ ----------- ----------
2023-05-11T17:04:54            11-MAY-2023 date
 
Another option is to create the SQL domain with the CAST option.
 
demo@FREEPDB1> create domain if not exists employeeD_CAST as json validate CAST '{
  2     "type":"object",
  3     "properties":{
  4             "name":{"type":"string","minLength":2} ,
  5             "salary":{"type":"number","minimum":800} ,
  6             "email":{"type":"string"} ,
  7             "joinDate":{"extendedType":"date"} } ,
  8     "required":["name","salary","joinDate"]
  9  }' ;
 
Operation 281 succeeded.
 
The only difference is the CAST keyword highlighted above, which means this JSON Schema is used not only to validate data on insert but also to type-cast it.
 
Let’s validate the effect of this domain.
 
demo@FREEPDB1> create table if not exists
  2  employees2( id number, data json domain employeeD_CAST );
 
Table created.
 
demo@FREEPDB1>
 
The same statement that previously failed now succeeds, because the ISO 8601 date string is cast into a date and inserted.
 
demo@FREEPDB1> insert into employees2( id,data )
  2  values(1, '{"name":"john",
  3     "salary":2000,
  4     "joinDate":"2023-05-11T17:04:54+0000"}' );
 
1 row created.
 
demo@FREEPDB1> select e.data.joinDate jd,
  2     e.data.joinDate.date(),
  3     e.data.joinDate.type() ty
  4  from employees2 e;
 
JD                             E.DATA.JOIN TY
------------------------------ ----------- ----------
2023-05-11T00:00:00            11-MAY-2023 date
 
But when you supply an invalid string that cannot be cast to a date, it errors like this:
 
demo@FREEPDB1> insert into employees2( id,data )
  2  values(2, '{"name":"john",
  3     "salary":2000,
  4     "joinDate":"Hi"}' );
insert into employees2( id,data )
            *
ERROR at line 1:
ORA-40875: JSON schema validation error
JZN-00066: invalid cast operation requested
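The other constraints in the schema are enforced the same way. For instance, a salary below the stated minimum of 800 should also be rejected with ORA-40875 (I have not reproduced the exact error output here):

```sql
-- "salary" of 500 violates the schema's "minimum":800 constraint,
-- so this insert should fail with ORA-40875
insert into employees2( id, data )
values( 3, '{"name":"john",
   "salary":500,
   "joinDate":"2023-05-11T17:04:54+0000"}' );
```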