Does DB2 have a feature to add a Sensitive Data Indicator to an object? - db2

SQL Server provides a feature to add a sensitivity indicator to columns/objects to identify what kind of data is stored in that column.
CREATE TABLE STUDENT (SNAME VARCHAR(1000));

ADD SENSITIVITY CLASSIFICATION TO
dbo.STUDENT.SNAME
WITH ( LABEL='Highly Confidential', INFORMATION_TYPE='Financial', RANK=CRITICAL );
Then we can fetch this information with the following query.
SELECT * FROM sys.sensitivity_classifications
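The view stores object and column IDs rather than names, so in practice you would typically join it to sys.objects and sys.columns to get readable output. A sketch of such a query (untested; rank_desc is only present on versions that support the RANK option):
SELECT SCHEMA_NAME(o.schema_id) AS table_schema,
       o.name AS table_name,
       c.name AS column_name,
       sc.label, sc.information_type, sc.rank_desc
FROM sys.sensitivity_classifications sc
JOIN sys.objects o ON o.object_id = sc.major_id
JOIN sys.columns c ON c.object_id = sc.major_id AND c.column_id = sc.minor_id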
Does DB2 have any feature similar to this?
SQL Server documentation: SQLServer_Documention_For_Sensitive_Data_Indicator

Db2 has the security feature Label-Based Access Control (LBAC). You define security policies and labels, assign the labels to data, and then define access control rules based on those labels.
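Roughly, the setup looks like the following. This is only a minimal sketch: the component, policy, and label names are made up for illustration (they are quoted so that they match the SECLABEL_BY_NAME call below).
CREATE SECURITY LABEL COMPONENT info_level ARRAY ['Financial', 'Public'];
CREATE SECURITY POLICY "Highly Confidential" COMPONENTS info_level WITH DB2LBACRULES;
CREATE SECURITY LABEL "Highly Confidential"."Financial" COMPONENT info_level 'Financial';
ALTER TABLE student ADD SECURITY POLICY "Highly Confidential";
ALTER TABLE student ADD COLUMN data_label DB2SECURITYLABEL;
GRANT SECURITY LABEL "Highly Confidential"."Financial" TO USER henrik FOR ALL ACCESS;
With the policy and label in place, a row can be tagged with a label at insert time: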
INSERT INTO student VALUES ('Henrik', SECLABEL_BY_NAME('Highly Confidential', 'Financial') )

Related

How to use security policy packages on Databricks

I am trying to create data security policies on user tables on Databricks. I have implemented this task on SQL Server with the queries below:
CREATE FUNCTION [test].[mailfunction](@useremail AS nvarchar(100))
RETURNS TABLE WITH SCHEMABINDING AS
RETURN SELECT 1 AS mailfunction_result WHERE @useremail = SUSER_SNAME()
GO
CREATE SECURITY POLICY [mailfunctionSecurityPolicy]
ADD FILTER PREDICATE [test].[mailfunction]([useremail]) ON
test.users WITH (STATE = OFF);
I am trying to implement this on Databricks and have created the function, but I am not able to create a SECURITY POLICY on Databricks.
I need the function, or a workaround for CREATE FUNCTION in Databricks, and need to achieve role-based access control on my table as we achieved on the SQL Server side.
Also, please suggest some reference code for implementing role-based access, row- and column-level security, and data masking in Databricks.
Right now there is no exact equivalent of this functionality, but it's coming in the near future - you can watch the latest Databricks quarterly roadmap webinar to get more details about the upcoming functionality for RBAC & ABAC.
But right now you can use dynamic views over the tables to implement row-level access control and data masking. For this you can use the current_user and is_member functions to perform checks. Like this (example from the docs):
CREATE VIEW sales_redacted AS
SELECT user_id,
  CASE WHEN is_member('auditors') THEN email
       ELSE 'REDACTED'
  END AS email,
  country, product, total
FROM sales_raw
And you can use user/group names from the data itself; it's not necessary to use hard-coded group names in the is_member call. You can see an example in the following answer.
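For the original filter-predicate use case, a rough equivalent (just a sketch - it assumes the test.users table and useremail column from the question) is a dynamic view that compares that column against current_user():
CREATE OR REPLACE VIEW test.users_filtered AS
SELECT *
FROM test.users
WHERE useremail = current_user()  -- each user sees only their own rows
Consumers then query the view instead of the underlying table, and access is controlled by granting permissions on the view rather than the table.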

How to filter Prometheus series based on the results of another query in a Grafana dashboard?

I am using Grafana 9.3.1 to monitor our system. Among other things, I am trying to monitor the remaining FUP of the phone number for each unit we operate.
Basically, we intend to use two data sources.
A database mapping of the unit ID to its phone number (e.g. unit_id=123, phone_number="00 123456789")
A Prometheus time series remaining_fup{phone_number="00 123456789"}. However, remaining_fup is third-party data and does not include unit_id.
In my unit-detail dashboard I have a unit_id variable which indicates which unit's FUP should be displayed (among other things that depend on unit_id).
My original approach was this:
Create a mixed-datasource dashboard
Add the database datasource as query A: SELECT phone_number FROM units WHERE unit_id='$unit_id'
Add the Prometheus datasource query remaining_fup and filter it based on A.phone_number: remaining_fup{phone_number="${A.phone_number}"}
Unfortunately, such use of A isn't supported. I had hoped to apply some transformation like Merge or Join by field and then Filter, but with no success. After a lot of googling and trying I feel hopeless.
Could you help please? Is such a filter even possible? Thanks!
TL;DR: In a Grafana dashboard I want to query one datasource to obtain a value which I subsequently want to use in another datasource's query.
1.) Create a variable - name: phone_number, type: Query - and query your database datasource: SELECT phone_number FROM units WHERE unit_id='$unit_id'. You can hide this variable if you don't want it to be visible to dashboard users.
2.) The phone_number variable may have multiple values, so use advanced variable formatting to create valid regex syntax for your Prometheus datasource, e.g.
remaining_fup{phone_number=~"${phone_number:pipe}"}
Of course these queries are just examples and they may need some (syntax) tweaking for the use case. Main idea: don't use two queries, but one variable and one query (where you use that variable).
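As an illustration (the numbers are made up), if phone_number resolves to two values, the :pipe formatter joins them with |, so the Prometheus query expands to something like:
remaining_fup{phone_number=~"00 123456789|00 987654321"}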

IBM DB2 Timetravel logging based on some criteria

I have been searching for a way to enable time travel on a certain table in DB2, but without capturing all of the updates done - only the updates done by some specific user.
I wanted to know if this is at all possible with DB2 time travel and how we can achieve it.
It's not possible with DB2 temporal tables to capture only some of the updates.
What you can do instead: alter the temporal table to add a user column maintained by the system. The Db2 for iSeries column definition is shown below:
EMP_CHANGE_USER VARCHAR(18) GENERATED ALWAYS AS (USER)
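In full, the ALTER might look like this (just a sketch - the table name employee is assumed for illustration):
ALTER TABLE employee
  ADD COLUMN emp_change_user VARCHAR(18) GENERATED ALWAYS AS (USER);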
The new column is automatically added to the history table of the temporal table as well. You can report on the history table and include EMP_CHANGE_USER.
Note: In real life, don't single out users. You can give management a report that lists all users, and management can filter it down to individuals. Programmers do not single out users for reporting and logging.

Determine identity generation in DB2

In DB2 9.7, when you create a table and specify an identity column, you can specify how the generation is handled. You have two choices: GENERATED ALWAYS or GENERATED BY DEFAULT.
Once the table has been created, how can you tell which method it's using (without performing an insert)?
From the Control Center, I tried to generate the DDL to see how it would generate it, but I'm not certain this is accurate.
You can find all schema-related information in the DB2 system catalog. The view SYSCAT.COLUMNS holds the core data about columns and their properties. To determine whether a column is GENERATED ALWAYS or GENERATED BY DEFAULT, look at the GENERATED column in that view.
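For example, a query along these lines (the schema and table names are placeholders) shows the setting - GENERATED is 'A' for GENERATED ALWAYS and 'D' for GENERATED BY DEFAULT:
SELECT colname, identity, generated
FROM syscat.columns
WHERE tabschema = 'MYSCHEMA'
  AND tabname = 'MYTABLE'
  AND identity = 'Y'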

How to access Library, File, and Field descriptions in DB2?

I would like to write a query that uses the IBM DB2 system tables (e.g. SYSIBM) to export the following:
LIBRARY_NAME, LIBRARY_DESC, FILE_NAME, FILE_DESC, FIELD_NAME, FIELD_DESC
I can access the descriptions via the UI, but wanted to generate a dynamic query.
Thanks.
Along with SYSTABLES and SYSCOLUMNS, there is also SYSSCHEMAS, which appears to contain the data you need. Please note that accessing this information through QSYS2 will restrict the rows returned to those objects to which you have some access - the SYSIBM schema appears to disregard this (check the reference - for V6R1 it's about page 1267).
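Putting those together, a sketch of such a query (the *_TEXT description columns are an assumption on my part - verify the exact column names against the catalog reference for your release):
SELECT s.schema_name  AS library_name,
       s.schema_text  AS library_desc,
       t.table_name   AS file_name,
       t.table_text   AS file_desc,
       c.column_name  AS field_name,
       c.column_text  AS field_desc
FROM qsys2.sysschemas s
JOIN qsys2.systables  t ON t.table_schema = s.schema_name
JOIN qsys2.syscolumns c ON c.table_schema = t.table_schema
                       AND c.table_name   = t.table_name
WHERE s.schema_name = 'MYLIB'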
You also shouldn't need to retrieve this with a dynamic query - static SQL with host variables (if necessary) will work just fine.