Hey everyone, I am trying to figure out how to compute an average over time from the following select on a table.
MinutesToLoad | Environment | Server | Exchange | StartTime | EndTime
140 | ENVIRONMENT_A | Server2 | TAI | 2013-01-06 17:22:44.000 | 2013-01-06 19:42:53.000
135 | ENVIRONMENT_A | Server2 | TAI | 2013-01-07 17:23:21.000 | 2013-01-07 19:38:37.000
130 | ENVIRONMENT_A | Server2 | TAI | 2013-01-08 17:24:03.000 | 2013-01-08 19:34:17.000
130 | ENVIRONMENT_A | Server2 | TAI | 2013-01-09 17:24:42.000 | 2013-01-09 19:34:45.000
140 | ENVIRONMENT_A | Server1 | TAI | 2013-01-06 17:22:44.000 | 2013-01-06 19:42:53.000
135 | ENVIRONMENT_A | Server1 | TAI | 2013-01-07 17:23:21.000 | 2013-01-07 19:38:37.000
130 | ENVIRONMENT_A | Server1 | TAI | 2013-01-08 17:24:03.000 | 2013-01-08 19:34:17.000
130 | ENVIRONMENT_A | Server1 | TAI | 2013-01-09 17:24:42.000 | 2013-01-09 19:34:45.000
1430 | ENVIRONMENT_A | Server1 | SET | 2013-01-07 00:03:01.000 | 2013-01-07 23:53:37.000
1431 | ENVIRONMENT_A | Server1 | SET | 2013-01-08 00:03:36.000 | 2013-01-08 23:54:14.000
1430 | ENVIRONMENT_A | Server1 | SET | 2013-01-09 00:04:14.000 | 2013-01-09 23:54:55.000
1430 | ENVIRONMENT_A | Server2 | SET | 2013-01-07 00:03:01.000 | 2013-01-07 23:53:37.000
1431 | ENVIRONMENT_A | Server2 | SET | 2013-01-08 00:03:36.000 | 2013-01-08 23:54:14.000
1430 | ENVIRONMENT_A | Server2 | SET | 2013-01-09 00:04:14.000 | 2013-01-09 23:54:55.000
1 | ENVIRONMENT_A | Server2 | QXI | 2013-01-08 03:23:57.000 | 2013-01-08 03:24:02.000
1 | ENVIRONMENT_A | Server1 | QXI | 2013-01-08 03:23:57.000 | 2013-01-08 03:24:02.000
MinutesToLoad is really a DATEDIFF(MINUTE, StartTime, EndTime) in my select procedure; the rest of the columns come from a table.
The goal is for the result to look like this:
MinutesToLoadOverLast10Days | Environment | Server | Exchange
133 | ENVIRONMENT_A | Server2 | TAI
133 | ENVIRONMENT_A | Server1 | TAI
1430 | ENVIRONMENT_A | Server1 | SET
1430 | ENVIRONMENT_A | Server2 | SET
1 | ENVIRONMENT_A | Server2 | QXI
1 | ENVIRONMENT_A | Server1 | QXI
I know I will be using the AVG function, but I am lost trying to figure out how to use it to get the result I want.
Thank you in advance.
Since MinutesToLoad is computed rather than stored, average the same DATEDIFF expression, and filter down to the last 10 days:
SELECT AVG(DATEDIFF(MINUTE, StartTime, EndTime)) AS MinutesToLoadOverLast10Days,
       Environment, Server, Exchange
FROM YOUR_TABLE_NAME
WHERE StartTime >= DATEADD(DAY, -10, GETDATE())
GROUP BY Environment, Server, Exchange
I am using PostgreSQL 9.4.
I have this table:
Column | Type | Modifiers
---------+-----------------------+---------------
noemp | integer | not null
nomemp | character varying(15) |
emploi | character varying(14) |
mgr | integer |
dateemb | date |
sal | real |
comm | real |
nodept | integer |
It has these values inside:
noemp | nomemp | emploi | mgr | dateemb | sal | comm | nodept
-------+-----------+----------------+------+------------+------+------+--------
7369 | SERGE | FONCTIONNAIRE | 7902 | 1980-12-07 | 800 | | 20
7499 | BRAHIM | VENDEUR | 7698 | 1981-02-20 | 1600 | 300 | 30
7521 | NASSIMA | VENDEUR | 7698 | 1981-02-22 | 1250 | 500 | 30
7566 | LUCIE | GESTIONNAIRE | 7839 | 1981-04-02 | 2975 | | 20
7654 | MARTIN | VENDEUR | 7698 | 1981-09-28 | 1250 | 1400 | 30
7698 | BENJAMIN | GESTIONNAIRE | 7839 | 1981-05-01 | 2850 | | 30
7782 | DAYANE | GESTIONNAIRE | 7839 | 1981-06-09 | 2450 | | 10
7788 | ARIJ | ANALYSTE | 7566 | 1982-12-09 | 3000 | | 20
7839 | MAYAR | PRESIDENT | | 1981-11-17 | 5000 | | 10
7844 | ROI | VENDEUR | 7698 | 1981-09-08 | 1500 | 0 | 30
7876 | VIRGINIE | FONCTIONNAIRE | 7788 | 0983-01-12 | 1100 | | 20
7902 | ASMA | ANALYSTE | 7566 | 1981-12-03 | 3000 | | 20
7934 | SIMONE | FONCTIONNAIRE | 7782 | 1982-01-23 | 1300 | | 10
7900 | LYNA | FONCTIONNAIRE | 7698 | 1981-12-03 | 950 | | 30
(14 rows)
When I write a function to count the rows with a given "nodept" value, like this one:
CREATE OR REPLACE FUNCTION depcount(integer) RETURNS integer AS $$
DECLARE
somme integer;
BEGIN
SELECT DISTINCT(COUNT(*)) FROM EMP WHERE nodept = $1 INTO somme;
RETURN somme;
END $$
LANGUAGE plpgsql;
with a SELECT depcount(30) FROM EMP;
I get this answer:
----------
6
6
6
6
6
6
6
6
6
6
6
6
6
6
(14 rows)
14 results, when I should normally have only one.
I should mention that I'm doing this for a course and can't change the PostgreSQL version, which must be 9.4.
Does anyone have an idea why I get 14 results instead of one?
Thank you.
You're executing the function once per row of EMP, so the SELECT COUNT(*) runs 14 times and you get its result once for each row.
You probably want SELECT depcount(30) (without a FROM clause) to run the function only once.
On a side note, using a function for this sort of query is overkill in most cases, in my opinion. You also don't need plpgsql; LANGUAGE sql would be enough here (though your real function may be more complicated than this example). Using DISTINCT(COUNT(*)) doesn't really make sense either, since COUNT(*) already returns a single value.
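For what it's worth, a minimal sketch of what the LANGUAGE sql version could look like, reusing the EMP table and nodept column from the question:

```sql
-- Plain-SQL version of the function; no DECLARE/BEGIN needed.
-- COUNT(*) returns bigint, so the return type changes accordingly.
CREATE OR REPLACE FUNCTION depcount(integer) RETURNS bigint AS $$
  SELECT COUNT(*) FROM EMP WHERE nodept = $1;
$$ LANGUAGE sql;

-- Called without a FROM clause, it runs exactly once
-- and returns a single row:
SELECT depcount(30);
```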
I have the following query:
SELECT
usersq1.id AS user_id, name, completed_at,
COUNT(usersq1.id) AS trips,
SUM(cost_amount_cents) AS daily_cost_amount_cents
FROM usersq1
LEFT OUTER JOIN tripsq1
ON usersq1.id = user_id
GROUP by usersq1.id, name, completed_at
ORDER by user_id, name, completed_at;
Which returns the following:
user_id | name | completed_at | trips | daily_cost_amount_cents
---------+---------------------+--------------+-------+-------------------------
1001 | Makeda Mosser | 2017-06-01 | 2 | 125
1001 | Makeda Mosser | 2017-06-02 | 1 | 125
1001 | Makeda Mosser | 2017-06-03 | 2 | 350
1001 | Makeda Mosser | 2017-06-04 | 2 | 200
1001 | Makeda Mosser | 2017-06-06 | 1 | 100
1001 | Makeda Mosser | 2017-06-07 | 1 | 125
1001 | Makeda Mosser | 2017-06-08 | 1 | 150
1002 | Libbie Luby | 2017-06-02 | 2 | 125
1002 | Libbie Luby | 2017-06-09 | 1 | 175
1003 | Linn Loughran | 2017-06-03 | 1 | 75
1004 | Natacha Ned | 2017-06-04 | 1 | 100
1005 | Lorrine Lunt | 2017-06-05 | 1 | 125
1006 | Tami Tineo | 2017-10-06 | 1 | 150
1007 | Delisa Deen | 2017-10-07 | 1 | 175
1008 | Mimi Miltenberger | 2017-10-08 | 1 | 200
1009 | Seth Sneller | 2017-10-09 | 1 | 25
1010 | Rickie Rossi | 2017-10-10 | 1 | 50
1011 | Jenise Jeanbaptiste | 2017-06-01 | 1 | 200
1011 | Jenise Jeanbaptiste | 2017-07-01 | 1 | 75
1012 | Genia Glatz | 2017-06-02 | 1 | 25
1012 | Genia Glatz | 2017-07-02 | 1 | 50
1013 | Onita Oddo | 2017-06-03 | 1 | 50
1014 | Dario Dreyer | 2017-06-04 | 1 | 75
1014 | Dario Dreyer | 2017-06-24 | 5 | 750
1015 | Toby Trent | | 1 |
I would like to produce another cumulative-sum column which keeps a running total of daily_cost_amount_cents per user. The expected output I would like is something like this:
+---------+---------------------+------------+-------+-------------------------+-----------+
| user_id | name | created_at | trips | daily_cost_amount_cents | cum_cents |
+---------+---------------------+------------+-------+-------------------------+-----------+
| 1001 | Makeda Mosser | 6/1/17 | 2 | 125 | 125 |
| 1001 | Makeda Mosser | 6/2/17 | 1 | 125 | 250 |
| 1001 | Makeda Mosser | 6/3/17 | 2 | 350 | 600 |
| 1001 | Makeda Mosser | 6/4/17 | 2 | 200 | 800 |
| 1001 | Makeda Mosser | 6/6/17 | 1 | 100 | 900 |
| 1001 | Makeda Mosser | 6/7/17 | 1 | 125 | 1025 |
| 1001 | Makeda Mosser | 6/8/17 | 1 | 150 | 1175 |
| 1002 | Libbie Luby | 6/2/17 | 2 | 125 | 125 |
| 1002 | Libbie Luby | 6/9/17 | 1 | 175 | 300 |
| 1003 | Linn Loughran | 6/3/17 | 1 | 75 | 75 |
| 1004 | Natacha Ned | 6/4/17 | 1 | 100 | 100 |
| 1005 | Lorrine Lunt | 6/5/17 | 1 | 125 | 125 |
| 1006 | Tami Tineo | 10/6/17 | 1 | 150 | 150 |
| 1007 | Delisa Deen | 10/7/17 | 1 | 175 | 175 |
| 1008 | Mimi Miltenberger | 10/8/17 | 1 | 200 | 200 |
| 1009 | Seth Sneller | 10/9/17 | 1 | 25 | 25 |
| 1010 | Rickie Rossi | 10/10/17 | 1 | 50 | 50 |
| 1011 | Jenise Jeanbaptiste | 6/1/17 | 1 | 200 | 200 |
| 1011 | Jenise Jeanbaptiste | 7/1/17 | 1 | 75 | 275 |
| 1012 | Genia Glatz | 6/2/17 | 1 | 25 | 25 |
| 1012 | Genia Glatz | 7/2/17 | 1 | 50 | 75 |
| 1013 | Onita Oddo | 6/3/17 | 1 | 50 | 50 |
| 1014 | Dario Dreyer | 6/4/17 | 1 | 75 | 75 |
| 1014 | Dario Dreyer | 6/24/17 | 5 | 750 | 750 |
| 1015 | Toby Trent | | 0 | | |
+---------+---------------------+------------+-------+-------------------------+-----------+
I am pretty sure that I need to use a window function to do this, but I can't seem to do it while preserving the grouping by user_id and created_at.
The problem is that in the presence of a GROUP BY clause, a window function at the same query level cannot reference the column alias daily_cost_amount_cents. Put your query into a WITH clause and you can easily do the windowing you want:
WITH t AS (
SELECT usersq1.id AS user_id,
name,
completed_at,
COUNT(completed_at) AS trips, -- To correctly handle 0 trips
SUM(cost_amount_cents) AS daily_cost_amount_cents
FROM usersq1
LEFT OUTER JOIN tripsq1 ON usersq1.id = user_id
GROUP BY usersq1.id, name, completed_at
ORDER BY user_id, name, completed_at
) SELECT user_id,
name,
completed_at AS created_at,
trips,
daily_cost_amount_cents,
SUM(daily_cost_amount_cents) OVER (PARTITION BY user_id
                                   ORDER BY completed_at) AS cum_cents
FROM t;
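As a side note, the WITH clause isn't strictly required here: a window function may wrap an aggregate directly, because windows are evaluated after GROUP BY. A sketch of that variant, using the same tables and columns as the question:

```sql
-- Nested aggregate inside a window: SUM(...) is computed per group first,
-- then the outer SUM(...) OVER accumulates it per user in date order.
SELECT usersq1.id AS user_id,
       name,
       completed_at AS created_at,
       COUNT(completed_at) AS trips,          -- counts 0 for users with no trips
       SUM(cost_amount_cents) AS daily_cost_amount_cents,
       SUM(SUM(cost_amount_cents)) OVER (PARTITION BY usersq1.id
                                         ORDER BY completed_at) AS cum_cents
FROM usersq1
LEFT OUTER JOIN tripsq1 ON usersq1.id = user_id
GROUP BY usersq1.id, name, completed_at
ORDER BY user_id, completed_at;
```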
I am using PostgreSQL, and I have a table called accidents (state, total accidents) and another table called population. I want to get the names of the top 3 states by total accidents, and then divide each of those states' population by its total accidents. How do I write such a query?
Explanation:
Population Table
rank| state | population
---+-----------------------------+------------
1 | Uttar Pradesh | 199581477
2 | Maharashtra | 112372972
3 | Bihar | 103804630
4 | West Bengal | 91347736
5 | Madhya Pradesh | 72597565
6 | Tamil Nadu | 72138958
7 | Rajasthan | 68621012
8 | Karnataka | 61130704
9 | Gujarat | 60383628
10 | Andhra Pradesh | 49665533
11 | Odisha | 41947358
12 | Telangana | 35193978
13 | Kerala | 33387677
14 | Jharkhand | 32966238
15 | Assam | 31169272
16 | Punjab | 27704236
17 | Haryana | 25753081
18 | Chhattisgarh | 25540196
19 | Jammu and Kashmir | 12548926
20 | Uttarakhand | 10116752
21 | Himachal Pradesh | 6856509
22 | Tripura | 3671032
23 | Meghalaya | 2964007
24 | Manipur | 2721756
25 | Nagaland | 1980602
26 | Goa | 1457723
27 | Arunachal Pradesh | 1382611
28 | Mizoram | 1091014
29 | Sikkim | 607688
30 | Delhi | 16753235
31 | Puducherry | 1244464
32 | Chandigarh | 1054686
33 | Andaman and Nicobar Islands | 379944
34 | Dadra and Nagar Haveli | 342853
35 | Daman and Diu | 242911
36 | Lakshadweep | 64429
accident table:
state | eqto8 | eqto10 | mrthn10 | ntknwn | total
-----------------------------+-------+--------+---------+--------+--------
Andhra Pradesh | 6425 | 8657 | 8144 | 19298 | 42524
Arunachal Pradesh | 88 | 76 | 87 | 0 | 251
Assam | 0 | 0 | 0 | 6535 | 6535
Bihar | 2660 | 3938 | 3722 | 0 | 10320
Chhattisgarh | 2888 | 7052 | 3571 | 0 | 13511
Goa | 616 | 1512 | 2184 | 0 | 4312
Gujarat | 4864 | 7864 | 7132 | 8089 | 27949
Haryana | 3365 | 2588 | 4112 | 0 | 10065
Himachal Pradesh | 276 | 626 | 977 | 1020 | 2899
Jammu and Kashmir | 1557 | 618 | 434 | 4100 | 6709
Jharkhand | 1128 | 701 | 1037 | 2845 | 5711
Karnataka | 11167 | 14715 | 18566 | 0 | 44448
Kerala | 5580 | 13271 | 17323 | 0 | 36174
Madhya Pradesh | 15630 | 16226 | 19354 | 0 | 51210
Maharashtra | 4117 | 5350 | 10538 | 46311 | 66316
Manipur | 147 | 453 | 171 | 0 | 771
Meghalaya | 210 | 154 | 119 | 0 | 483
Mizoram | 27 | 58 | 25 | 0 | 110
Nagaland | 11 | 13 | 18 | 0 | 42
Odisha | 1881 | 3120 | 4284 | 0 | 9285
Punjab | 1378 | 2231 | 1825 | 907 | 6341
Rajasthan | 5534 | 5895 | 5475 | 6065 | 22969
Sikkim | 6 | 144 | 8 | 0 | 158
Tamil Nadu | 8424 | 18826 | 29871 | 10636 | 67757
Tripura | 290 | 376 | 222 | 0 | 888
Uttarakhand | 318 | 305 | 456 | 393 | 1472
Uttar Pradesh | 8520 | 10457 | 10995 | 0 | 29972
West Bengal | 1494 | 1311 | 974 | 8511 | 12290
Andaman and Nicobar Islands | 18 | 104 | 114 | 0 | 236
Chandigarh | 112 | 39 | 210 | 58 | 419
Dadra and Nagar Haveli | 40 | 20 | 17 | 8 | 85
Daman and Diu | 11 | 6 | 8 | 25 | 50
Delhi | 0 | 0 | 0 | 6937 | 6937
Lakshadweep | 0 | 0 | 0 | 3 | 3
Puducherry | 154 | 668 | 359 | 0 | 1181
All India | 88936 | 127374 | 152332 | 121741 | 490383
So the result should be
21.57
81.03
107.44
Explanation:
The states with the highest accident totals are Tamil Nadu, Maharashtra, and Madhya Pradesh.
Tamil Nadu population/accidents = 21213/983 = 21.57 (assumed values)
Maharashtra population/accidents = 10000/123 = 81.03
Madhya Pradesh population/accidents = 34812/324 = 107.44
My query is:
SELECT POPULATION/
(SELECT TOTAL
FROM accidents
WHERE STATE NOT LIKE 'All %'
ORDER BY TOTAL DESC
LIMIT 3)
aVG FROM population
WHERE STATE IN
(SELECT STATE
FROM accidents
WHERE STATE NOT LIKE 'All %'
ORDER BY TOTAL DESC
LIMIT 3);
which throws ERROR: more than one row returned by a subquery used as an expression.
How to modify the query to get the required result or any other way to get the result in postgresql?
This ought to do it (cast to numeric so the division isn't silently truncated to an integer):
SELECT a.state, ROUND(population.population::numeric / a.total, 2) AS ratio
FROM (SELECT state, total
      FROM accidents
      WHERE state <> 'All India'
      ORDER BY total DESC
      LIMIT 3) AS a
INNER JOIN population ON a.state = population.state;
I have a table in SQL Server 2008 R2 which contains this data:
+-----+------+--------+-------+------+-------+
| ID | Kind | Date | Price | Type | Amount|
+-----+------+--------+-------+------+-------+
| 525 | 32 |1/1/2016| 240 | 0 | 3000 |
| 525 | 32 |1/1/2016| 380 | 1 | 3000 |
| 525 | 32 |1/1/2016| 240 | 0 | 4000 |
| 525 | 32 |1/1/2016| 380 | 1 | 4000 |
+-----+------+--------+-------+------+-------+
How can I get this result?
+-----+------+--------+-------+------+-------+
| ID | Kind | Date | Price | Type | Amount|
+-----+------+--------+-------+------+-------+
| 525 | 32 |1/1/2016| 240 | 0 | 3000 |
| 525 | 32 |1/1/2016| 380 | 1 | 4000 |
+-----+------+--------+-------+------+-------+
Will this not do?
SELECT DISTINCT ID, Kind, Date, Price, Type, Amount FROM dbo.yourTable
I'm fairly new to PostgreSQL (and to SQL itself), so forgive me if I'm missing something (yes, I did try the search as well).
I'm trying to use the average of a resulting column from a query in the HAVING clause. Is it possible? Or is there a better solution?
I would like the column with Alias 'Sent_traffic' to display only those rows with a value greater than or equal to the average of that column.
The following query is what I would like to do (it doesn't work because of the HAVING clause):
select p.name, SUM(f.bytes_ab) as Sent_traffic
from pcap_flow f
inner join ports p
ON f.source_ip = p.source_ip AND f.source_port=p.source_port AND p.destination_ip=f.destination_ip AND p.destination_port=f.destination_port
WHERE p.logged_at BETWEEN f.first_packet AND f.last_packet
AND p.logged_at BETWEEN '2016-03-15' AND '2016-04-01'
AND p.name!=''
GROUP BY p.name
HAVING SUM(f.bytes_ba) >= AVG(Sent_traffic)
ORDER BY Sent DESC
LIMIT 10;
What does work is the same query without the AVG(Sent_traffic) in the HAVING clause:
select p.name, SUM(f.bytes_ab) as Sent_traffic
from pcap_flow f
inner join ports p
ON f.source_ip = p.source_ip AND f.source_port=p.source_port AND p.destination_ip=f.destination_ip AND p.destination_port=f.destination_port
WHERE p.logged_at BETWEEN f.first_packet AND f.last_packet
AND p.logged_at BETWEEN '2016-03-15' AND '2016-04-01'
AND p.name!=''
GROUP BY p.name
HAVING SUM(f.bytes_ba) >= 999999
ORDER BY Sent DESC
LIMIT 10;
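Since HAVING can't reference the Sent_traffic alias, one way to do this (a sketch, reusing the question's tables and columns, and summing bytes_ab for both the per-name totals and the filter) is to compute the sums once in a CTE and compare each row against the average of that derived column:

```sql
-- Compute per-name sent totals once, then keep only the names whose
-- total is at or above the average of all totals.
WITH sums AS (
  SELECT p.name, SUM(f.bytes_ab) AS sent_traffic
  FROM pcap_flow f
  INNER JOIN ports p
    ON  f.source_ip = p.source_ip AND f.source_port = p.source_port
    AND p.destination_ip = f.destination_ip AND p.destination_port = f.destination_port
  WHERE p.logged_at BETWEEN f.first_packet AND f.last_packet
    AND p.logged_at BETWEEN '2016-03-15' AND '2016-04-01'
    AND p.name != ''
  GROUP BY p.name
)
SELECT name, sent_traffic
FROM sums
WHERE sent_traffic >= (SELECT AVG(sent_traffic) FROM sums)
ORDER BY sent_traffic DESC
LIMIT 10;
```

The subquery against the CTE runs over the already-grouped rows, so the average-of-sums comparison that HAVING could not express works in a plain WHERE clause.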
A snapshot of the tables is as follows:
pcap_flow table:
id | pcap_id | flow_code | source_ip | source_port | destination_ip | destination_port | protocol | first_packet | last_packet | status | total_time | total_packets | idle_time_ab | idle_time_ba | bytes_ab | bytes_ba
-------+---------+-----------+---------------------------+-------------+---------------------------+------------------+----------+----------------------------+----------------------------+--------------------------+--------------+---------------+--------------+--------------+----------+----------
1 | 42 | a2b | 192.168.0.22 | 50191 | 128.93.101.81 | 443 | T | 2016-03-25 09:43:46.184039 | 2016-03-25 09:43:55.950184 | reset | 9.766144 | 10 | 9.7053 | 9.7065 | 510 | 3601
2 | 42 | c2d | 192.168.0.22 | 50127 | 74.125.133.189 | 443 | T | 2016-03-25 09:43:46.212468 | 2016-03-25 09:44:04.860872 | reset (syns 0) (fins 1) | 18.648403 | 6 | | | 0 | 0
3 | 42 | e2f | 192.168.0.22 | 50194 | 192.168.0.254 | 80 | T | 2016-03-25 09:43:46.302105 | 2016-03-25 09:43:49.615557 | yes | 3.313451 | 10 | 1.7933 | 2.0006 | 336 | 1421
4 | 42 | g2h | 192.168.0.22 | 50196 | 104.16.27.216 | 80 | T | 2016-03-25 09:43:46.335128 | 2016-03-25 09:43:46.454677 | yes | 0.119549 | 5 | 0.1172 | 0.1162 | 236 | 1705
5 | 42 | i2j | 192.168.0.254 | 443 | 192.168.0.22 | 50190 | T | 2016-03-25 09:43:46.420872 | 2016-03-25 09:43:46.422176 | no (syns 0) (fins 2) | 0.001304 | 6 | 0.0012 | 0.0002 | 2063 | 74
6 | 42 | k2l | 192.168.0.22 | 50197 | 192.168.0.254 | 443 | T | 2016-03-25 09:43:46.457142 | 2016-03-25 09:43:57.94442 | yes | 11.487277 | 26 | 2.001 | 2.001 | 3678 | 18859
7 | 42 | m2n | 192.168.0.22 | 50170 | 192.168.0.254 | 443 | T | 2016-03-25 09:43:46.509135 | 2016-03-25 09:43:46.51023 | reset (syns 0) (fins 2) | 0.001095 | 2 | 0.001 | 0.0001 | 0 | 37
8 | 42 | o2p | 192.168.0.22 | 50161 | 54.149.211.23 | 443 | T | 2016-03-25 09:43:46.510764 | 2016-03-25 09:43:46.512014 | reset (syns 0) (fins 2) | 0.00125 | 3 | 0.0011 | 0 | 37 | 37
9 | 42 | q2r | 192.168.0.22 | 50198 | 192.168.0.254 | 443 | T | 2016-03-25 09:43:46.511504 | 2016-03-25 09:43:46.744645 | reset | 0.233141 | 7 | 0.138 | 0.0981 | 385 | 4342
10 | 42 | s2t | 192.168.0.22 | 50199 | 192.168.0.254 | 443 | T | 2016-03-25 09:43:46.511667 | 2016-03-25 09:43:46.962999 | reset | 0.451332 | 8 | 0.2478 | 0.2018 | 385 | 4342
11 | 42 | u2v | 192.168.0.22 | 50200 | 104.16.27.216 | 80 | T | 2016-03-25 09:43:46.600772 | 2016-03-25 09:43:46.776045 | yes | 0.175273 | 5 | 0.166 | 0.1648 | 463 | 1604
12 | 42 | w2x | 192.168.0.22 | 50201 | 104.16.27.216 | 80 | T | 2016-03-25 09:43:46.606515 | 2016-03-25 09:43:46.760278 | yes | 0.153763 | 6 | 0.1517 | 0.1504 | 463 | 1604
13 | 42 | y2z | 192.168.0.22 | 50163 | 172.217.16.78 | 443 | T | 2016-03-25 09:43:46.744559 | 2016-03-25 09:43:46.746244 | reset (syns 0) (fins 2) | 0.001685 | 3 | 0.0014 | 0.0001 | 37 | 37
14 | 42 | aa2ab | 192.168.0.22 | 50202 | 192.168.0.254 | 443 | T | 2016-03-25 09:43:46.763646 | 2016-03-25 09:43:48.673286 | reset | 1.90964 | 8 | 1.5839 | 1.789 | 397 | 4342
15 | 42 | ac2ad | 192.168.0.22 | 50203 | 104.16.27.216 | 80 | T | 2016-03-25 09:43:46.854744 | 2016-03-25 09:43:46.998117 | yes | 0.143373 | 5 | 0.1414 | 0.1401 | 463 | 1604
ports table:
id | session_id | pid | name | protocol | source_ip | destination_ip | source_port | destination_port | state | logged_at
--------+------------+-------+-------------------------+----------+---------------------------+---------------------------+-------------+------------------+-------+-------------------------
1 | 1 | 676 | svchost.exe | 17 | 0.0.0.0 | | 68 | | 2 | 2016-03-16 09:41:04.716
2 | 1 | 4 | | 17 | 192.168.0.22 | | 137 | | 2 | 2016-03-16 09:41:04.716
3 | 1 | 4 | | 17 | 192.168.0.22 | | 138 | | 2 | 2016-03-16 09:41:04.716
4 | 1 | 3408 | svchost.exe | 17 | 127.0.0.1 | | 1900 | | 2 | 2016-03-16 09:41:04.716
5 | 1 | 3408 | svchost.exe | 17 | 192.168.0.22 | | 1900 | | 2 | 2016-03-16 09:41:04.716
6 | 1 | 3092 | uoipservice.exe | 17 | 192.168.0.22 | | 1900 | | 2 | 2016-03-16 09:41:04.716
7 | 1 | 2208 | mdnsresponder.exe | 17 | 192.168.0.22 | | 5353 | | 2 | 2016-03-16 09:41:04.716
8 | 1 | 1032 | svchost.exe | 17 | 0.0.0.0 | | 5355 | | 2 | 2016-03-16 09:41:04.716
9 | 1 | 2208 | mdnsresponder.exe | 17 | 0.0.0.0 | | 49152 | | 2 | 2016-03-16 09:41:04.716
10 | 1 | 3092 | uoipservice.exe | 17 | 192.168.0.22 | | 51128 | | 2 | 2016-03-16 09:41:04.716
11 | 1 | 3092 | uoipservice.exe | 17 | 192.168.0.22 | | 51129 | | 2 | 2016-03-16 09:41:04.716
12 | 1 | 3408 | svchost.exe | 17 | 192.168.0.22 | | 61182 | | 2 | 2016-03-16 09:41:04.716
13 | 1 | 3408 | svchost.exe | 17 | 127.0.0.1 | | 61183 | | 2 | 2016-03-16 09:41:04.716
14 | 1 | 676 | svchost.exe | 17 | fe80::1d77:665c:b2e5:3be3 | | 546 | | 2 | 2016-03-16 09:41:04.716
15 | 1 | 3408 | svchost.exe | 17 | ::1 | | 1900 | | 2 | 2016-03-16 09:41:04.716
16 | 1 | 3408 | svchost.exe | 17 | fe80::1d77:665c:b2e5:3be3 | | 1900 | | 2 | 2016-03-16 09:41:04.716
17 | 1 | 2208 | mdnsresponder.exe | 17 | ::1 | | 5353 | | 2 | 2016-03-16 09:41:04.716
18 | 1 | 1032 | svchost.exe | 17 | :: | | 5355 | | 2 | 2016-03-16 09:41:04.716
19 | 1 | 2208 | mdnsresponder.exe | 17 | :: | | 49153 | | 2 | 2016-03-16 09:41:04.716
20 | 1 | 3408 | svchost.exe | 17 | fe80::1d77:665c:b2e5:3be3 | | 61180 | | 2 | 2016-03-16 09:41:04.716
21 | 1 | 3408 | svchost.exe | 17 | ::1 | | 61181 | | 2 | 2016-03-16 09:41:04.716
22 | 1 | 972 | svchost.exe | 6 | 0.0.0.0 | 0.0.0.0 | 135 | 0 | 2 | 2016-03-16 09:41:04.716
23 | 1 | 4 | | 6 | 192.168.0.22 | 0.0.0.0 | 139 | 0 | 2 | 2016-03-16 09:41:04.716
24 | 1 | 1728 | devmonsrv.exe | 6 | 127.0.0.1 | 0.0.0.0 | 515 | 0 | 2 | 2016-03-16 09:41:04.716
25 | 1 | 2208 | mdnsresponder.exe | 6 | 127.0.0.1 | 0.0.0.0 | 5354 | 0 | 2 | 2016-03-16 09:41:04.716
26 | 1 | 2564 | hostviewcli.exe | 6 | 127.0.0.1 | 0.0.0.0 | 40123 | 0 | 2 | 2016-03-16 09:41:04.716
27 | 1 | 716 | wininit.exe | 6 | 0.0.0.0 | 0.0.0.0 | 49152 | 0 | 2 | 2016-03-16 09:41:04.716
28 | 1 | 676 | svchost.exe | 6 | 0.0.0.0 | 0.0.0.0 | 49153 | 0 | 2 | 2016-03-16 09:41:04.716
29 | 1 | 1076 | svchost.exe | 6 | 0.0.0.0 | 0.0.0.0 | 49154 | 0 | 2 | 2016-03-16 09:41:04.716
30 | 1 | 772 | services.exe | 6 | 0.0.0.0 | 0.0.0.0 | 49155 | 0 | 2 | 2016-03-16 09:41:04.716
31 | 1 | 4872 | vpnui.exe | 6 | 127.0.0.1 | 127.0.0.1 | 49156 | 62522 | 5 | 2016-03-16 09:41:04.716
32 | 1 | 788 | lsass.exe | 6 | 0.0.0.0 | 0.0.0.0 | 49157 | 0 | 2 | 2016-03-16 09:41:04.716
33 | 1 | 3092 | uoipservice.exe | 6 | 192.168.0.22 | 0.0.0.0 | 49164 | 0 | 2 | 2016-03-16 09:41:04.716