We implemented the Google Combo chart with some horizontal labels in place, but somehow it's not showing the first label. Does anybody have any insight into why it isn't working?
Example: https://www.cdfund.com/track-record/rendement/nac.html
Code example:
var data = new google.visualization.DataTable();
data.addColumn('date', 'Time of measurement');
data.addColumn('number', 'Benchmark (50%/50% TSX-V/HUI) ');
data.addColumn('number', 'CDF NAC ');
data.addRows([[new Date(2018, 0, 1),42.09,82.47,],[new Date(2018, 1, 1),42.88,82.47,],[new Date(2018, 2, 1),39.33,78.26,],[new Date(2018, 3, 1),38.96,72.98,],[new Date(2018, 4, 1),38.98,77.62,],[new Date(2018, 5, 1),38.64,79.53,],[new Date(2018, 6, 1),37.46,75.12,],[new Date(2018, 7, 1),35.75,72.28,],[new Date(2018, 8, 1),33.72,69.29,],[new Date(2018, 9, 1),33.10,71.27,],[new Date(2018, 10, 1),31.72,68.62,],[new Date(2018, 11, 1),30.54,65.53,],[new Date(2019, 0, 1),31.49,61.23,],[new Date(2019, 1, 1),34.30,64.15,],[new Date(2019, 2, 1),34.11,64.13,],[new Date(2019, 3, 1),34.37,63.52,],[new Date(2019, 4, 1),32.61,58.88,],[new Date(2019, 5, 1),32.38,56.60,],[new Date(2019, 6, 1),35.77,59.77,],[new Date(2019, 7, 1),36.44,62.15,],[new Date(2019, 8, 1),39.01,65.34,],[new Date(2019, 9, 1),35.86,61.54,],[new Date(2019, 10, 1),36.70,60.51,],[new Date(2019, 11, 1),36.03,59.00,],[new Date(2020, 0, 1),39.85,67.53,],[new Date(2020, 1, 1),39.15,66.76,],[new Date(2020, 2, 1),34.93,59.35,],[new Date(2020, 3, 1),28.78,50.16,],[new Date(2020, 4, 1),38.07,69.69,],[new Date(2020, 5, 1),41.80,79.14,],[new Date(2020, 6, 1),45.95,91.51,],[new Date(2020, 7, 1),54.05,104.16,],[new Date(2020, 8, 1),55.26,116.85,],[new Date(2020, 9, 1),51.67,115.98,],[new Date(2020, 10, 1),49.87,111.20,],[new Date(2020, 11, 1),49.84,113.11,],[new Date(2021, 0, 1),55.39,125.83,],[new Date(2021, 1, 1),55.39,117.29,],[new Date(2021, 2, 1),56.02,116.46,],[new Date(2021, 3, 1),54.85,113.09,],[new Date(2021, 4, 1),55.98,123.36,],[new Date(2021, 5, 1),60.81,133.58,],[new Date(2021, 6, 1),55.63,120.68,],[new Date(2021, 7, 1),55.32,118.26,],[new Date(2021, 8, 1),52.44,111.19,],[new Date(2021, 9, 1),48.82,102.59,],[new Date(2021, 10, 1),53.49,113.06,],[new Date(2021, 11, 1),53.79,109.98,],[new Date(2022, 0, 1),54.24,114.31,],[new Date(2022, 1, 1),50.69,106.74,],[new Date(2022, 2, 1),53.79,112.16,],[new Date(2022, 3, 1),58.19,118.96,],[new Date(2022, 4, 1),52.91,113.69,],[new Date(2022, 5, 1),47.26,102.92,],[new Date(2022, 6, 1),40.73,86.32,],[new Date(2022, 7, 1),40.44,95.37,],[new Date(2022, 8, 1),38.20,92.43,],[new Date(2022, 9, 1),37.64,81.94,],[new Date(2022, 10, 1),37.82,81.27,],[new Date(2022, 11, 1),,,]]);
var options = {
hAxis: {
format: 'yyyy',
gridlines: { count: 5, color: 'transparent' },
ticks: [new Date(2018, 3, 1), new Date(2019, 1, 1), new Date(2020, 1, 1), new Date(2021, 1, 1), new Date(2022, 1, 1)],
minorGridlines: { color: 'transparent' },
textStyle: { color: '#000', fontSize: 8 }
},
vAxis: {
minorGridlines: { color: 'transparent' },
gridlines: { count: 4 },
textStyle: { color: '#706345', italic: true, fontSize: 8 },
textPosition: 'in',
},
height: '360',
colors: ['#CB9B01','#AA9870','#C2AE81','#706345','#E2D7BD'],
backgroundColor: '#F4F3F0',
chartArea: { 'width': '90%', 'height': '65%' },
legend: { 'position': 'bottom', padding: 30 },
seriesType: 'area',
series: { 1: { type: 'line' }, 2: { type: 'line' }, 3: { type: 'line' }, 4: { type: 'line' }, 5: { type: 'line' } }
};
Thanks
I have a list of ids in my page
var index = ['0', '5', '9'];
and a list of classes in costant.dart:
static final List<SensationItem> sensastionList = [
SensationItem(
code: 0,
title: 'Formicolio / Intorpidimento',
imageUrl: 'images/intorpidimento.png'),
SensationItem(
code: 1,
title: 'Sudorazione intensa',
imageUrl: 'images/sudorazione.png'),
SensationItem(
code: 2, title: 'Dolore al petto', imageUrl: 'images/petto.png'),
SensationItem(code: 3, title: 'Nausea', imageUrl: 'images/nausea.png'),
SensationItem(code: 4, title: 'Tremori', imageUrl: 'images/tremori.png'),
SensationItem(
code: 5,
title: 'Paura di perdere il controllo',
imageUrl: 'images/controllo.png'),
SensationItem(
code: 6,
title: 'Sbandamento / Vertigini',
imageUrl: 'images/vertigini.png'),
SensationItem(
code: 7, title: 'Palpitazioni', imageUrl: 'images/palpitazioni.png'),
SensationItem(
code: 8,
title: 'Sensazione di soffocamento',
imageUrl: 'images/soffocamento.png'),
];
}
How can I convert these ids and find the matching elements in the list?
Ex.:
0 => 'Formicolio / Intorpidimento'
4 => 'Tremori'
I then want to add them to another list and print it in a widget.
Try something like this:
I used sensastionList.where to filter out the results that were in your list of indexes (I turned them into integers first so that they could match) and then used forEach() to add the results into the new list of SensationItem, as you asked.
You can copy and paste it into DartPad to see how it works.
void main() {
final List<SensationItem> sensastionList = [
SensationItem(
code: 0,
title: 'Formicolio / Intorpidimento',
imageUrl: 'images/intorpidimento.png'),
SensationItem(
code: 1,
title: 'Sudorazione intensa',
imageUrl: 'images/sudorazione.png'),
SensationItem(
code: 2, title: 'Dolore al petto', imageUrl: 'images/petto.png'),
SensationItem(code: 3, title: 'Nausea', imageUrl: 'images/nausea.png'),
SensationItem(code: 4, title: 'Tremori', imageUrl: 'images/tremori.png'),
SensationItem(
code: 5,
title: 'Paura di perdere il controllo',
imageUrl: 'images/controllo.png'),
SensationItem(
code: 6,
title: 'Sbandamento / Vertigini',
imageUrl: 'images/vertigini.png'),
SensationItem(
code: 7, title: 'Palpitazioni', imageUrl: 'images/palpitazioni.png'),
SensationItem(
code: 8,
title: 'Sensazione di soffocamento',
imageUrl: 'images/soffocamento.png'),
];
var index = [0, 5, 9];
List<SensationItem> newList = [];
sensastionList.where((element) => index.contains(element.code)).forEach((item) => newList.add(item));
print(newList);
}
class SensationItem {
int code;
String title;
String imageUrl;
SensationItem({required this.title, required this.imageUrl, required this.code});
@override
String toString() => 'Sensation item - code: ${this.code} title: ${this.title}';
}
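If you only need the titles for those codes (the 0 => 'Formicolio / Intorpidimento' mapping from the question), a lookup map keeps it to a single pass. A minimal sketch that could go inside the same main() as above; ids, codeToTitle and selectedTitles are illustrative names, and it starts again from the question's original string ids:
// Starting again from the question's string ids.
final ids = ['0', '5', '9'];
// Build a lookup table once: code -> title.
final codeToTitle = {
  for (final item in sensastionList) item.code: item.title,
};
// Parse the ids and resolve each one, skipping codes that do not exist (e.g. 9).
final selectedTitles = ids
    .map(int.parse)
    .where(codeToTitle.containsKey)
    .map((code) => codeToTitle[code])
    .toList();
print(selectedTitles); // [Formicolio / Intorpidimento, Paura di perdere il controllo]
Building the map once avoids scanning sensastionList for every id, which only matters if the list grows.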
I have a DataFrame in PySpark which has hundreds of millions of rows (here is a dummy sample of it):
import datetime
import pyspark.sql.functions as F
from pyspark.sql import Window,Row
from pyspark.sql.functions import col
from pyspark.sql.functions import month, mean,sum,year,avg
from pyspark.sql.functions import concat_ws,to_date,unix_timestamp,datediff,lit
from pyspark.sql.functions import when,min,max,desc,row_number,col
dg = sqlContext.createDataFrame(sc.parallelize([
Row(cycle_dt=datetime.datetime(1984, 5, 2, 0, 0), network_id=4,norm_strength=0.5, spend_active_ind=1,net_spending_amt=0,cust_xref_id=10),
Row(cycle_dt=datetime.datetime(1984, 6, 2, 0, 0), network_id=4,norm_strength=0.5, spend_active_ind=1,net_spending_amt=2,cust_xref_id=11),
Row(cycle_dt=datetime.datetime(1984, 7, 2, 0, 0), network_id=4,norm_strength=0.5, spend_active_ind=1,net_spending_amt=2,cust_xref_id=12),
Row(cycle_dt=datetime.datetime(1984, 4, 2, 0, 0), network_id=4,norm_strength=0.5, spend_active_ind=1,net_spending_amt=2,cust_xref_id=13),
Row(cycle_dt=datetime.datetime(1983,11, 5, 0, 0), network_id=1,norm_strength=0.5, spend_active_ind=0,net_spending_amt=8,cust_xref_id=1 ),
Row(cycle_dt=datetime.datetime(1983,12, 2, 0, 0), network_id=1,norm_strength=0.5, spend_active_ind=0,net_spending_amt=2,cust_xref_id=1 ),
Row(cycle_dt=datetime.datetime(1984, 1, 3, 0, 0), network_id=1,norm_strength=0.5, spend_active_ind=1,net_spending_amt=15,cust_xref_id=1 ),
Row(cycle_dt=datetime.datetime(1984, 3, 2, 0, 0), network_id=1,norm_strength=0.5, spend_active_ind=0,net_spending_amt=7,cust_xref_id=1 ),
Row(cycle_dt=datetime.datetime(1984, 4, 3, 0, 0), network_id=1,norm_strength=0.5, spend_active_ind=0,net_spending_amt=1,cust_xref_id=1 ),
Row(cycle_dt=datetime.datetime(1984, 5, 2, 0, 0), network_id=1,norm_strength=0.5, spend_active_ind=0,net_spending_amt=1,cust_xref_id=1 ),
Row(cycle_dt=datetime.datetime(1984,10, 6, 0, 0), network_id=1,norm_strength=0.5, spend_active_ind=1,net_spending_amt=10,cust_xref_id=1 ),
Row(cycle_dt=datetime.datetime(1984, 1, 7, 0, 0), network_id=1,norm_strength=0.4, spend_active_ind=0,net_spending_amt=8,cust_xref_id=2 ),
Row(cycle_dt=datetime.datetime(1984, 1, 2, 0, 0), network_id=1,norm_strength=0.4, spend_active_ind=0,net_spending_amt=3,cust_xref_id=2 ),
Row(cycle_dt=datetime.datetime(1984, 2, 7, 0, 0), network_id=1,norm_strength=0.4, spend_active_ind=1,net_spending_amt=5,cust_xref_id=2 ),
Row(cycle_dt=datetime.datetime(1985, 2, 7, 0, 0), network_id=1,norm_strength=0.3, spend_active_ind=1,net_spending_amt=8,cust_xref_id=3 ),
Row(cycle_dt=datetime.datetime(1985, 3, 7, 0, 0), network_id=1,norm_strength=0.3, spend_active_ind=0,net_spending_amt=2,cust_xref_id=3 ),
Row(cycle_dt=datetime.datetime(1985, 4, 7, 0, 0), network_id=1,norm_strength=0.3, spend_active_ind=1,net_spending_amt=1,cust_xref_id=3 ),
Row(cycle_dt=datetime.datetime(1985, 4, 8, 0, 0), network_id=1,norm_strength=0.3, spend_active_ind=1,net_spending_amt=9,cust_xref_id=3 ),
Row(cycle_dt=datetime.datetime(1984, 4, 2, 0, 0), network_id=2,norm_strength=0.5, spend_active_ind=0,net_spending_amt=3,cust_xref_id=4 ),
Row(cycle_dt=datetime.datetime(1984, 4, 3, 0, 0), network_id=2,norm_strength=0.5, spend_active_ind=0,net_spending_amt=2,cust_xref_id=4 ),
Row(cycle_dt=datetime.datetime(1984, 1, 2, 0, 0), network_id=2,norm_strength=0.5, spend_active_ind=0,net_spending_amt=5,cust_xref_id=4 ),
Row(cycle_dt=datetime.datetime(1984, 1, 3, 0, 0), network_id=2,norm_strength=0.5, spend_active_ind=1,net_spending_amt=6,cust_xref_id=4 ),
Row(cycle_dt=datetime.datetime(1984, 3, 2, 0, 0), network_id=2,norm_strength=0.5, spend_active_ind=0,net_spending_amt=2,cust_xref_id=4 ),
Row(cycle_dt=datetime.datetime(1984, 1, 5, 0, 0), network_id=2,norm_strength=0.5, spend_active_ind=0,net_spending_amt=9,cust_xref_id=4 ),
Row(cycle_dt=datetime.datetime(1984, 1, 6, 0, 0), network_id=2,norm_strength=0.5, spend_active_ind=1,net_spending_amt=1,cust_xref_id=4 ),
Row(cycle_dt=datetime.datetime(1984, 1, 7, 0, 0), network_id=2,norm_strength=0.4, spend_active_ind=0,net_spending_amt=7,cust_xref_id=5 ),
Row(cycle_dt=datetime.datetime(1984, 1, 2, 0, 0), network_id=2,norm_strength=0.4, spend_active_ind=0,net_spending_amt=8,cust_xref_id=5 ),
Row(cycle_dt=datetime.datetime(1984, 2, 7, 0, 0), network_id=2,norm_strength=0.4, spend_active_ind=1,net_spending_amt=3,cust_xref_id=5 ),
Row(cycle_dt=datetime.datetime(1985, 2, 7, 0, 0), network_id=2,norm_strength=0.6, spend_active_ind=1,net_spending_amt=6,cust_xref_id=6 ),
Row(cycle_dt=datetime.datetime(1985, 3, 7, 0, 0), network_id=2,norm_strength=0.6, spend_active_ind=0,net_spending_amt=9,cust_xref_id=6 ),
Row(cycle_dt=datetime.datetime(1985, 4, 7, 0, 0), network_id=2,norm_strength=0.6, spend_active_ind=1,net_spending_amt=4,cust_xref_id=6 ),
Row(cycle_dt=datetime.datetime(1985, 4, 8, 0, 0), network_id=2,norm_strength=0.6, spend_active_ind=1,net_spending_amt=6,cust_xref_id=6 ),
Row(cycle_dt=datetime.datetime(1984, 4, 2, 0, 0), network_id=3,norm_strength=0.5, spend_active_ind=0,net_spending_amt=0,cust_xref_id=7 ),
Row(cycle_dt=datetime.datetime(1984, 4, 3, 0, 0), network_id=3,norm_strength=0.5, spend_active_ind=0,net_spending_amt=0,cust_xref_id=7 ),
Row(cycle_dt=datetime.datetime(1984, 1, 2, 0, 0), network_id=3,norm_strength=0.5, spend_active_ind=0,net_spending_amt=0,cust_xref_id=7 ),
Row(cycle_dt=datetime.datetime(1984, 1, 3, 0, 0), network_id=3,norm_strength=0.5, spend_active_ind=0,net_spending_amt=0,cust_xref_id=7 ),
Row(cycle_dt=datetime.datetime(1984, 3, 2, 0, 0), network_id=3,norm_strength=0.5, spend_active_ind=0,net_spending_amt=0,cust_xref_id=7 ),
Row(cycle_dt=datetime.datetime(1984, 1, 5, 0, 0), network_id=3,norm_strength=0.5, spend_active_ind=0,net_spending_amt=0,cust_xref_id=7 ),
Row(cycle_dt=datetime.datetime(1984, 1, 6, 0, 0), network_id=3,norm_strength=0.5, spend_active_ind=0,net_spending_amt=0,cust_xref_id=7 ),
Row(cycle_dt=datetime.datetime(1984, 1, 7, 0, 0), network_id=3,norm_strength=0.4, spend_active_ind=0,net_spending_amt=3,cust_xref_id=8 ),
Row(cycle_dt=datetime.datetime(1984, 1, 2, 0, 0), network_id=3,norm_strength=0.4, spend_active_ind=0,net_spending_amt=2,cust_xref_id=8 ),
Row(cycle_dt=datetime.datetime(1984, 2, 7, 0, 0), network_id=3,norm_strength=0.4, spend_active_ind=1,net_spending_amt=8,cust_xref_id=8 ),
Row(cycle_dt=datetime.datetime(1985, 2, 7, 0, 0), network_id=3,norm_strength=0.6, spend_active_ind=1,net_spending_amt=4,cust_xref_id=9 ),
Row(cycle_dt=datetime.datetime(1985, 3, 7, 0, 0), network_id=3,norm_strength=0.6, spend_active_ind=0,net_spending_amt=1,cust_xref_id=9 ),
Row(cycle_dt=datetime.datetime(1985, 4, 7, 0, 0), network_id=3,norm_strength=0.6, spend_active_ind=1,net_spending_amt=9,cust_xref_id=9 ),
Row(cycle_dt=datetime.datetime(1985, 4, 8, 0, 0), network_id=3,norm_strength=0.6, spend_active_ind=0,net_spending_amt=3,cust_xref_id=9 )
]))
I am trying to sum spend_active_ind for each cust_xref_id and keep those whose sum is greater than zero. One way to do this is using groupby and join:
dg1 = dg.groupby("cust_xref_id").agg(sum("spend_active_ind").alias("sum_spend_active_ind"))
dg1 = dg1.filter(dg1.sum_spend_active_ind != 0).select("cust_xref_id")
dg = dg.alias("t1").join(dg1.alias("t2"),col("t1.cust_xref_id")==col("t2.cust_xref_id")).select(col("t1.*"))
The other way I can think of is using a window:
w = Window.partitionBy ('cust_xref_id')
dg = dg.withColumn('sum_spend_active_ind',sum(dg.spend_active_ind).over(w))
dg = dg.filter(dg.sum_spend_active_ind!=0)
Which of these methods (or any other method) is more efficient for what I am trying to do?
Thanks
You could try opening the Spark UI at localhost:4040, or inspect the query plan using the explain method:
(
dg
.groupby('cust_xref_id')
.agg(F.sum('spend_active_ind').alias('sum_spend_active_ind'))
.filter(F.col('sum_spend_active_ind') > 0)
).explain()
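For a side-by-side comparison, you could also print the plan of the window variant from the question; a sketch reusing the same dg and the imports at the top:
# Window variant: one shuffle of all rows by cust_xref_id, no join afterwards.
w = Window.partitionBy('cust_xref_id')
(
    dg
    .withColumn('sum_spend_active_ind', F.sum('spend_active_ind').over(w))
    .filter(F.col('sum_spend_active_ind') > 0)
).explain()
Comparing the two plans usually shows the deciding factor: the aggregate-and-join version can reduce rows before the shuffle and may get a broadcast join for the small aggregate, while the window keeps every row through a single shuffle by cust_xref_id.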
I have a Google Timeline chart. I need to highlight the block that the current date falls in. How can I implement this? This is my code. Here I need to highlight the 'F' block, which spans the current date.
function drawChart() {
var container = document.getElementById('Gateways');
var chart = new google.visualization.Timeline(container);
var dataTable = new google.visualization.DataTable();
dataTable.addColumn({ type: 'string', id: 'Room' });
dataTable.addColumn({ type: 'string', id: 'Name' });
dataTable.addColumn({ type: 'date', id: 'Start' });
dataTable.addColumn({ type: 'date', id: 'End' });
dataTable.addRows([
[ '1', 'A', new Date(2011, 3, 30), new Date(2012, 2, 4) ],
[ '1', 'B', new Date(2012, 2, 4), new Date(2013, 3, 30) ],
[ '1', 'C', new Date(2013, 3, 30), new Date(2014, 2, 4) ],
[ '1', 'D', new Date(2014, 2, 4), new Date(2015, 2, 4) ],
[ '1', 'E', new Date(2015, 3, 30), new Date(2016, 2, 4) ],
[ '1', 'F', new Date(2016, 2, 4), new Date(2017, 2, 4) ],
[ '1', 'G', new Date(2017, 2, 4), new Date(2018, 2, 4) ],
[ '1', 'H', new Date(2018, 2, 4), new Date(2019, 2, 4) ],
[ '1', 'I', new Date(2019, 2, 4), new Date(2020, 2, 4) ],
[ '1', 'J', new Date(2020, 2, 4), new Date(2021, 2, 4) ]]);
var options = {
timeline: { showRowLabels: false },
avoidOverlappingGridLines: false
};
chart.draw(dataTable, options);
}
import java.time.LocalDate
case class Day(date: LocalDate, other: String)
val list = Seq(
Day(LocalDate.of(2016, 2, 1), "text"),
Day(LocalDate.of(2016, 2, 2), "text"), // Tuesday
Day(LocalDate.of(2016, 2, 3), "text"),
Day(LocalDate.of(2016, 2, 4), "text"),
Day(LocalDate.of(2016, 2, 5), "text"),
Day(LocalDate.of(2016, 2, 6), "text"),
Day(LocalDate.of(2016, 2, 7), "text"),
Day(LocalDate.of(2016, 2, 8), "text"),
Day(LocalDate.of(2016, 2, 9), "text"),
Day(LocalDate.of(2016, 2, 10), "text"),
Day(LocalDate.of(2016, 2, 11), "text"),
Day(LocalDate.of(2016, 2, 12), "text"),
Day(LocalDate.of(2016, 2, 13), "text"),
Day(LocalDate.of(2016, 2, 14), "text"),
Day(LocalDate.of(2016, 2, 15), "text"),
Day(LocalDate.of(2016, 2, 16), "text"),
Day(LocalDate.of(2016, 2, 17), "text")
)
// hard code, for example Tuesday
def groupDaysBy(list: Seq[Day]): List[List[Day]] = {
???
}
val result =
Seq(
Seq(Day(LocalDate.of(2016, 2, 1), "text")), // Separate
Seq(Day(LocalDate.of(2016, 2, 2), "text"), // Tuesday
Day(LocalDate.of(2016, 2, 3), "text"),
Day(LocalDate.of(2016, 2, 4), "text"),
Day(LocalDate.of(2016, 2, 5), "text"),
Day(LocalDate.of(2016, 2, 6), "text"),
Day(LocalDate.of(2016, 2, 7), "text"),
Day(LocalDate.of(2016, 2, 8), "text")),
Seq(Day(LocalDate.of(2016, 2, 9), "text"), // Tuesday
Day(LocalDate.of(2016, 2, 10), "text"),
Day(LocalDate.of(2016, 2, 11), "text"),
Day(LocalDate.of(2016, 2, 12), "text"),
Day(LocalDate.of(2016, 2, 13), "text"),
Day(LocalDate.of(2016, 2, 14), "text"),
Day(LocalDate.of(2016, 2, 15), "text")),
Seq(Day(LocalDate.of(2016, 2, 16), "text"), // Tuesday
Day(LocalDate.of(2016, 2, 17), "text"))
)
assert(groupDaysBy(list) == result)
I have a list of Day objects, and I want to group every 7 days together; the start day can be any day of the week (Monday to Sunday; I give Tuesday as an example).
Above are the function signature and expected result for my requirement. I am wondering how I can take advantage of the Scala collection API to achieve this without tail recursion.
Here's what you can do:
import java.time.DayOfWeek

// hard code, for example Tuesday
def groupDaysBy(list: Seq[Day]): Seq[Seq[Day]] = {
  val (list1, list2) = list.span(_.date.getDayOfWeek != DayOfWeek.TUESDAY)
  Seq(list1) ++ list2.grouped(7)
}
I would recommend taking the day as a parameter instead of hardcoding it, though, so it becomes:
// dayOfWeek is now a parameter, e.g. DayOfWeek.TUESDAY
def groupDaysBy(list: Seq[Day], dayOfWeek: DayOfWeek): Seq[Seq[Day]] = {
  val (list1, list2) = list.span(_.date.getDayOfWeek != dayOfWeek)
  Seq(list1) ++ list2.grouped(7)
}
...
assert(groupDaysBy(list, DayOfWeek.TUESDAY) == result)
Alternatively, map your list to create a tuple (GroupKey, value), with GroupKey being a value that uniquely identifies a week (year * 53 + week_of_the_year, for example). Then you can group on GroupKey.
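A minimal sketch of that idea, assuming weeks start on Tuesday as in the question; java.time.temporal.WeekFields lets the week number respect an arbitrary first day of week, and groupDaysByWeekKey / groupKey are illustrative names:
import java.time.DayOfWeek
import java.time.temporal.WeekFields

def groupDaysByWeekKey(list: Seq[Day]): Seq[Seq[Day]] = {
  // Week numbering for weeks that start on Tuesday.
  val weekFields = WeekFields.of(DayOfWeek.TUESDAY, 1)

  // A key that uniquely identifies a week: year * 53 + week-of-year.
  def groupKey(d: Day): Int =
    d.date.get(weekFields.weekBasedYear()) * 53 +
      d.date.get(weekFields.weekOfWeekBasedYear())

  list.groupBy(groupKey).toSeq.sortBy(_._1).map(_._2)
}
For the sample list this yields the same four groups as the expected result, since sorting by the key keeps the weeks in chronological order and groupBy preserves the original order of the days inside each week.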