I have the following code (about a year old), which I use for working with zip codes in an application. Today I had to change a value, and now all of a sudden I can't compile it anymore. I get an "unexpected newline" error at the "h" in "hilleroed".
class Application
  postal_codes:
    regions:
      sjaelland: [0..4999]
      fyn: [5000..5999]
      soenderjylland: [6000..6200]
        .concat [6261..6280]
        .concat [6600..6899]
        .concat [6041..6064]
        .concat [6960]
        .concat [7000..7079]
        .concat [7200]
        .concat [7250]
      nordmidtjylland: [6900..6999]
        .concat [7080..7199]
        .concat [7182..7184]
        .concat [7190..7190]
        .concat [7260..9999]
    office_postal:
      ringsted: 4100
      hilleroed: 3400
      kgslyngby: 2800
      odense: 5220
      haderslev: 6100
      esbjerg: 6715
      herning: 7400
      aalborg: 9200
      aarhus: 8210
      horsens: 8700
    offices:
      ringsted: [0..2690]
        .concat [2770, 2791]
        .concat [4000..4030]
        .concat [4050..4990]
      hilleroed: [2700..2765]
        .concat [2980..3670]
        .concat [4040]
      kgslyngby: [2800..2970]
      odense: [5000..5985]
      haderslev: [6000..6240]
        .concat [6300..6622]
        .concat [6630..6640]
        .concat [7000..7080]
        .concat [7182..7184]
      esbjerg: [6261..6280]
        .concat [6623]
        .concat [6650..6879] # 6880 is herning.. meh
        .concat [6881..6893]
        .concat [6960]
        .concat [7200..7260]
      herning: [6880]
        .concat [6900..6950]
        .concat [6971..6990]
        .concat [7270..7280]
        .concat [7330..7680]
        .concat [7800..7884]
        .concat [8800]
        .concat [8831]
      aalborg: [7700..7790]
        .concat [7900..7990]
        .concat [8832]
        .concat [8970]
        .concat [9000..9999]
      aarhus: [8000..8270]
        .concat [8305..8340]
        .concat [8355..8643]
        .concat [8660..8670]
        .concat [8830]
        .concat [8840..8963]
        .concat [8981..8990]
      horsens: [7100..7173]
        .concat [7190]
        .concat [7300..7323]
        .concat [8300, 8350]
        .concat [8653..8659]
        .concat [8680..8783]
I can't for the life of me figure out what's wrong. Is this a bug or am I missing something? If I paste the code into the online compiler on coffeescript.org I get the same error. How do I fix it?
EDIT:
I am using coffee-script 1.7.1. It works in 1.6.3.
UPDATE:
This is an unintended bug, see https://github.com/jashkenas/coffee-script/issues/3408
The issue is in the offices object. This code fails:
offices:
  ringsted: [0..2690]
    .concat [2770, 2791]
    .concat [4000..4030]
    .concat [4050..4990]
  hilleroed: [2700..2765]
    .concat [2980..3670]
    .concat [4040]
  kgslyngby: [2800..2970]
While this passes:
offices:
  ringsted: [0..2690].concat [2770, 2791].concat [4000..4030].concat [4050..4990]
  hilleroed: [2700..2765].concat [2980..3670].concat [4040]
  kgslyngby: [2800..2970]
If you really want to keep the line breaks, add some parens:
offices:
  ringsted: ([0..2690]
    .concat [2770..2791]
    .concat [4000..4030]
    .concat [4050..4990])
  hilleroed: ([2700..2765]
    .concat [2980..3670]
    .concat [4040])
  kgslyngby: [2800..2970]
It appears that the compiler is confused about where your object definitions end. If you leave out the parens you'll get this js:
// without parens
var _i, _j, _k, _l, _results, _results1, _results2, _results3;

({
  offices: {
    ringsted: (function() {
      _results3 = [];
      for (_l = 0; _l <= 2690; _l++) { _results3.push(_l); }
      return _results3;
    }).apply(this)
  }.concat((function() {
    _results2 = [];
    for (_k = 2770; _k <= 2791; _k++) { _results2.push(_k); }
    return _results2;
  }).apply(this)).concat((function() {
    _results1 = [];
    for (_j = 4000; _j <= 4030; _j++) { _results1.push(_j); }
    return _results1;
  }).apply(this)).concat((function() {
    _results = [];
    for (_i = 4050; _i <= 4990; _i++) { _results.push(_i); }
    return _results;
  }).apply(this))
});
Which, when run in the Node REPL, returns:
TypeError: Object #<Object> has no method 'concat'
Which makes sense, based on the js, but doesn't make sense based on the coffee. This might be worth submitting as an issue.
TLDR: use parens when you want function return values to be assigned to a field of an object.
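As a sanity check on what the parenthesized `ringsted` expression is supposed to build, here is a rough Python equivalent (illustrative only; CoffeeScript's inclusive `[a..b]` ranges correspond to `range(a, b + 1)`):

```python
# Rough Python equivalent of the parenthesized CoffeeScript expression:
# ([0..2690]).concat([2770, 2791]).concat([4000..4030]).concat([4050..4990])
ringsted = (
    list(range(0, 2691))        # [0..2690]
    + [2770, 2791]              # .concat [2770, 2791]
    + list(range(4000, 4031))   # .concat [4000..4030]
    + list(range(4050, 4991))   # .concat [4050..4990]
)
print(len(ringsted))  # 2691 + 2 + 31 + 941 = 3665
```

With the parens, the whole chain is one expression assigned to the `ringsted` key, which is exactly this flat array.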
I'm having an issue with JPA and JSON querying. My JPA provider is EclipseLink and I use a Postgres DB.
My query is
with values as(select id, inputspecifications as spec from process where commercial = True and inputspecifications #> '[{\"type\":\"raster\"}]') select id from values where (spec -> 'platforms' is null or (spec -> 'platforms' -> 'satellites' is not null and (spec -> 'platforms' -> 'satellites' ?& array['310802']))
The query works fine until the array-inclusion comparison (the last bit). It seems JPA is seeing ?& as a positional argument, as per the logs:
[EL Warning]: sql: 2022-11-07 10:22:05.336--ServerSession(65586123)--Missing Query parameter for named argument: & "null" will be substituted.
[EL Fine]: sql: 2022-11-07 10:22:05.336--ServerSession(65586123)--Connection(1463355115)--with values as(select id, inputspecifications as spec from process where commercial = True and inputspecifications #> '[{"type":"raster"}]') select id from values where (spec -> 'platforms' is null or (spec -> 'platforms' -> 'satellites' is not null and (spec -> 'platforms' -> 'satellites' ? array['310802']))
bind => [null]
[EL Fine]: sql: 2022-11-07 10:22:05.343--ServerSession(65586123)--SELECT 1
[EL Warning]: 2022-11-07 10:22:05.344--UnitOfWork(446445803)--Exception [EclipseLink-4002] (Eclipse Persistence Services - 2.7.6.v20200131-b7c997804f): org.eclipse.persistence.exceptions.DatabaseException
Internal Exception: org.postgresql.util.PSQLException: ERROR: syntax error at or near "$1"
Position: 293
Error Code: 0
I have tried escaping in various ways, i.e. \?&, \?\&, ... all fail one way or another.
Any idea how to make JPA NOT see ?& as a positional parameter?
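One workaround worth trying (hedged suggestion; verify against your Postgres version) is to avoid the literal ? entirely by using the function form of the operator, jsonb_exists_all, so nothing in the SQL can be mistaken for a parameter marker. A sketch of the rewritten predicate as a string:

```python
# Sketch: rewrite the jsonb ?& operator as its function form so no
# literal '?' reaches the JPA/JDBC layer. jsonb_exists_all(jsonb, text[])
# is the function behind ?& -- confirm it exists on your Postgres version.
predicate = (
    "spec -> 'platforms' -> 'satellites' is not null "
    "and jsonb_exists_all(spec -> 'platforms' -> 'satellites', array['310802'])"
)
assert "?" not in predicate  # nothing for the driver to misparse
print(predicate)
```

Note this assumes the value being tested is jsonb (cast it if the column is plain json), and the function form may not use indexes the way the operator does.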
Looking for a solution to my problem.
I have a table that in this example contains only 2 columns.
The column varVersion shows the version number of our application.
The column dateLoginTime shows when a customer last logged in to the application.
My first thought was to just take the MAX date, order by date, and group by varVersion. This seemed to work just fine until users started going back to old application versions.
In my example you can see that a user was on version 2.1.3 and then moved back to 1.1.8. With the MAX time it looks like he used this version for 6 days; when I looked at the data, he used it for only 5 minutes (a mistake).
Max:
varVersion | dateLoginTime
2.1.4 | 2018-03-13 11:31:26.893
1.1.8 | 2018-03-07 16:40:21.060
2.1.3 | 2018-02-28 12:26:52.760
2.1.2 | 2018-02-15 12:35:42.707
1.1.6 | 2018-01-23 15:01:46.410
I'm looking to create new FROM/TO fields showing the period during which each version was used, but I'm failing to get correct results. I tried MIN/MAX/OVER but the results are still wrong.
Min/Max result:
varVersion |FROM |TO
2.1.4 |2018-02-28 22:45:48.687 |2018-03-13 11:31:26.893
2.1.3 |2018-02-26 12:16:41.907 |2018-02-28 12:26:52.760
2.1.2 |2018-02-14 19:56:11.837 |2018-02-15 12:35:42.707
1.1.8 |2018-01-24 12:19:06.933 |2018-03-07 16:40:21.060
1.1.6 |2018-01-08 16:54:46.780 |2018-01-23 15:01:46.410
Expected Result
version |FROM |TO
2.1.4 |2018-03-07 16:45:10.207 |2018-03-13 11:31:26.893
1.1.8 |2018-03-07 16:40:21.060 |2018-03-07 16:45:10.207
2.1.4 |2018-02-28 22:45:48.687 |2018-03-07 16:40:21.060
2.1.3 |2018-02-26 12:16:41.907 |2018-02-28 22:45:48.687
2.1.2 |2018-02-14 19:56:11.837 |2018-02-26 12:16:41.907
1.1.8 |2018-01-24 12:19:06.933 |2018-02-14 19:56:11.837
1.1.6 |2018-01-08 16:54:46.780 |2018-01-24 12:19:06.933
Anyone have some ideas?
Thanks in advance
Petr
DATA:
--POPULATE DATA FOR TEST
drop table #temp
create table #temp
(varVersion VARCHAR(100),
dateLoginTime DATETIME)
INSERT INTO #temp (varVersion, dateLoginTime)
values
('2.1.4','2018-03-13 11:31:26.893'),
('2.1.4','2018-03-12 11:22:12.650'),
('2.1.4','2018-03-08 08:40:18.133'),
('2.1.4','2018-03-07 16:45:10.207'),
('1.1.8','2018-03-07 16:40:21.060'),
('2.1.4','2018-03-07 12:28:08.823'),
('2.1.4','2018-03-02 12:21:58.583'),
('2.1.4','2018-03-01 12:20:17.163'),
('2.1.4','2018-02-28 22:49:42.320'),
('2.1.4','2018-02-28 22:45:48.687'),
('2.1.3','2018-02-28 12:26:52.760'),
('2.1.3','2018-02-27 12:21:50.887'),
('2.1.3','2018-02-26 12:16:41.907'),
('2.1.2','2018-02-15 12:35:42.707'),
('2.1.2','2018-02-14 19:56:11.837'),
('1.1.8','2018-02-14 12:39:50.603'),
('1.1.8','2018-02-02 12:34:08.393'),
('1.1.8','2018-01-25 12:18:19.790'),
('1.1.8','2018-01-24 12:19:06.933'),
('1.1.6','2018-01-23 15:01:46.410'),
('1.1.6','2018-01-22 12:12:18.510'),
('1.1.6','2018-01-08 16:54:46.780')
--ORIGINAL STATEMENT
SELECT DISTINCT TOP 10
varVersion ,
MAX(dateLoginTime) dateLoginTime--, MAX(dateLoginTime)--, MAX(login_time)
FROM #temp
GROUP BY varVersion
ORDER BY 2 DESC
--NEW STATEMENT
SELECT DISTINCT TOP 10
varVersion ,
MIN(dateLoginTime) 'FROM', MAX(dateLoginTime) 'TO'
FROM #temp
GROUP BY varVersion
ORDER BY 2 DESC
select * from #temp
This should work. Use LAG to find the rows where the version changes (the start of each interval), and then search forward for the end.
declare @T table (ver VARCHAR(10), dt DATETIME);
INSERT INTO @T (ver, dt)
values
('2.1.4','2018-03-13 11:31:26.893'),
('2.1.4','2018-03-12 11:22:12.650'),
('2.1.4','2018-03-08 08:40:18.133'),
('2.1.4','2018-03-07 16:45:10.207'),
('1.1.8','2018-03-07 16:40:21.060'),
('2.1.4','2018-03-07 12:28:08.823'),
('2.1.4','2018-03-02 12:21:58.583'),
('2.1.4','2018-03-01 12:20:17.163'),
('2.1.4','2018-02-28 22:49:42.320'),
('2.1.4','2018-02-28 22:45:48.687'),
('2.1.3','2018-02-28 12:26:52.760'),
('2.1.3','2018-02-27 12:21:50.887'),
('2.1.3','2018-02-26 12:16:41.907'),
('2.1.2','2018-02-15 12:35:42.707'),
('2.1.2','2018-02-14 19:56:11.837'),
('1.1.8','2018-02-14 12:39:50.603'),
('1.1.8','2018-02-02 12:34:08.393'),
('1.1.8','2018-01-25 12:18:19.790'),
('1.1.8','2018-01-24 12:19:06.933'),
('1.1.6','2018-01-23 15:01:46.410'),
('1.1.6','2018-01-22 12:12:18.510'),
('1.1.6','2018-01-08 16:54:46.780');
select tt.ver, tt.dt as frm
     , isnull((select min(td.dt) from @T td where td.ver <> tt.ver and td.dt > tt.dt), (select max(dt) from @T)) as too
from ( select t.ver, t.dt, lag(t.ver) over (order by t.dt asc) as lagVer
       from @T t
     ) tt
where tt.ver <> tt.lagVer or tt.lagVer is null
order by tt.dt desc;
ver frm too
---------- ----------------------- -----------------------
2.1.4 2018-03-07 16:45:10.207 2018-03-13 11:31:26.893
1.1.8 2018-03-07 16:40:21.060 2018-03-07 16:45:10.207
2.1.4 2018-02-28 22:45:48.687 2018-03-07 16:40:21.060
2.1.3 2018-02-26 12:16:41.907 2018-02-28 22:45:48.687
2.1.2 2018-02-14 19:56:11.837 2018-02-26 12:16:41.907
1.1.8 2018-01-24 12:19:06.933 2018-02-14 19:56:11.837
1.1.6 2018-01-08 16:54:46.780 2018-01-24 12:19:06.933
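To double-check the logic independently of T-SQL, here is a small Python sketch of the same algorithm run against the sample data: keep the rows where the version differs from the previous row (ordered by time), then take the next different-version timestamp as the interval end, falling back to the overall maximum.

```python
# Python sketch of the LAG-based interval logic, using the sample data.
rows = [
    ("2.1.4", "2018-03-13 11:31:26.893"), ("2.1.4", "2018-03-12 11:22:12.650"),
    ("2.1.4", "2018-03-08 08:40:18.133"), ("2.1.4", "2018-03-07 16:45:10.207"),
    ("1.1.8", "2018-03-07 16:40:21.060"), ("2.1.4", "2018-03-07 12:28:08.823"),
    ("2.1.4", "2018-03-02 12:21:58.583"), ("2.1.4", "2018-03-01 12:20:17.163"),
    ("2.1.4", "2018-02-28 22:49:42.320"), ("2.1.4", "2018-02-28 22:45:48.687"),
    ("2.1.3", "2018-02-28 12:26:52.760"), ("2.1.3", "2018-02-27 12:21:50.887"),
    ("2.1.3", "2018-02-26 12:16:41.907"), ("2.1.2", "2018-02-15 12:35:42.707"),
    ("2.1.2", "2018-02-14 19:56:11.837"), ("1.1.8", "2018-02-14 12:39:50.603"),
    ("1.1.8", "2018-02-02 12:34:08.393"), ("1.1.8", "2018-01-25 12:18:19.790"),
    ("1.1.8", "2018-01-24 12:19:06.933"), ("1.1.6", "2018-01-23 15:01:46.410"),
    ("1.1.6", "2018-01-22 12:12:18.510"), ("1.1.6", "2018-01-08 16:54:46.780"),
]
rows.sort(key=lambda r: r[1])  # ISO timestamps sort correctly as strings
max_dt = rows[-1][1]
intervals = []
for i, (ver, dt) in enumerate(rows):
    lag_ver = rows[i - 1][0] if i > 0 else None
    if ver != lag_ver:  # version changed: start of a new interval
        # the interval ends at the first later login on a *different* version
        later = [d for v, d in rows if v != ver and d > dt]
        intervals.append((ver, dt, min(later) if later else max_dt))
intervals.reverse()  # newest first, like the T-SQL output
for ver, frm, to in intervals:
    print(ver, frm, to)
```

Running this reproduces the seven expected rows, including the five-minute 1.1.8 interval on 2018-03-07.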
I'm upgrading from Sonar 3.4.1 to 4.1.2 using Oracle 10.2.
I've updated all plugins to the versions supported by the target version.
After clicking the Upgrade button on the /setup page I got this error:
2014.07.08 19:15:06 INFO [DbMigration]
2014.07.08 19:15:06 INFO [DbMigration] == AddNetworkUseSubCharacteristic: migrating =================================
2014.07.08 19:15:06 ERROR [o.s.s.ui.JRubyFacade] Fail to upgrade database
An error has occurred, all later migrations canceled:
ActiveRecord::JDBCError: ORA-00904: "DESCRIPTION": invalid identifier
: INSERT INTO characteristics (kee, name, rule_id, characteristic_order, enabled, parent_id, root_id, function_key, factor_value, factor_unit, offset_value, offset_unit, created_at, updated_at, quality_model_id, depth, description, id) VALUES('NETWORK_USE_EFFICIENCY', 'Network use', NULL, NULL, 1, 10347, 10347, NULL, NULL, NULL, NULL, NULL, TO_TIMESTAMP('2014-07-08 19:15:06:825000','YYYY-MM-DD HH24:MI:SS:FF6'), TO_TIMESTAMP('2014-07-08 19:15:06:825000','YYYY-MM-DD HH24:MI:SS:FF6'), NULL, NULL, NULL, ?)
/opt/sonarqube-4.1.2/web/WEB-INF/gems/gems/activerecord-2.3.15/lib/active_record/connection_adapters/abstract_adapter.rb:227:in `log'
/opt/sonarqube-4.1.2/web/WEB-INF/gems/gems/activerecord-2.3.15/lib/active_record/connection_adapters/abstract_adapter.rb:212:in `log'
/opt/sonarqube-4.1.2/web/WEB-INF/gems/gems/activerecord-jdbc-adapter-1.1.3/lib/arjdbc/oracle/adapter.rb:183:in `ora_insert'
/opt/sonarqube-4.1.2/web/WEB-INF/gems/gems/activerecord-2.3.15/lib/active_record/connection_adapters/abstract/query_cache.rb:26:in `insert_with_query_dirty'
/opt/sonarqube-4.1.2/web/WEB-INF/gems/gems/activerecord-2.3.15/lib/active_record/base.rb:2967:in `create'
/opt/sonarqube-4.1.2/web/WEB-INF/gems/gems/activerecord-2.3.15/lib/active_record/timestamp.rb:53:in `create_with_timestamps'
/opt/sonarqube-4.1.2/web/WEB-INF/gems/gems/activerecord-2.3.15/lib/active_record/callbacks.rb:266:in `create_with_callbacks'
/opt/sonarqube-4.1.2/web/WEB-INF/gems/gems/activerecord-2.3.15/lib/active_record/base.rb:2933:in `create_or_update'
/opt/sonarqube-4.1.2/web/WEB-INF/gems/gems/activerecord-2.3.15/lib/active_record/callbacks.rb:250:in `create_or_update_with_callbacks'
/opt/sonarqube-4.1.2/web/WEB-INF/gems/gems/activerecord-2.3.15/lib/active_record/base.rb:2583:in `save'
/opt/sonarqube-4.1.2/web/WEB-INF/gems/gems/activerecord-2.3.15/lib/active_record/validations.rb:1089:in `save_with_validation'
/opt/sonarqube-4.1.2/web/WEB-INF/gems/gems/activerecord-2.3.15/lib/active_record/dirty.rb:79:in `save_with_dirty'
org/jruby/RubyKernel.java:2225:in `send'
/opt/sonarqube-4.1.2/web/WEB-INF/gems/gems/activerecord-2.3.15/lib/active_record/transactions.rb:229:in `with_transaction_returning_status'
/opt/sonarqube-4.1.2/web/WEB-INF/gems/gems/activerecord-2.3.15/lib/active_record/connection_adapters/abstract/database_statements.rb:136:in `transaction'
/opt/sonarqube-4.1.2/web/WEB-INF/gems/gems/activerecord-2.3.15/lib/active_record/transactions.rb:182:in `transaction'
/opt/sonarqube-4.1.2/web/WEB-INF/gems/gems/activerecord-2.3.15/lib/active_record/transactions.rb:228:in `with_transaction_returning_status'
/opt/sonarqube-4.1.2/web/WEB-INF/gems/gems/activerecord-2.3.15/lib/active_record/transactions.rb:196:in `save_with_transactions'
/opt/sonarqube-4.1.2/web/WEB-INF/gems/gems/activerecord-2.3.15/lib/active_record/transactions.rb:208:in `rollback_active_record_state!'
/opt/sonarqube-4.1.2/web/WEB-INF/gems/gems/activerecord-2.3.15/lib/active_record/transactions.rb:196:in `save_with_transactions'
/opt/sonarqube-4.1.2/web/WEB-INF/gems/gems/activerecord-2.3.15/lib/active_record/base.rb:727:in `create'
/opt/sonarqube-4.1.2/web/WEB-INF/db/migrate/466_add_network_use_sub_characteristic.rb:42:in `up'
org/jruby/RubyKernel.java:2221:in `send'
/opt/sonarqube-4.1.2/web/WEB-INF/gems/gems/activerecord-2.3.15/lib/active_record/migration.rb:282:in `migrate'
jar:file:/opt/sonarqube-4.1.2/web/WEB-INF/lib/jruby-complete-1.7.6.jar!/META-INF/jruby.home/lib/ruby/1.8/benchmark.rb:293:in `measure'
/opt/sonarqube-4.1.2/web/WEB-INF/gems/gems/activerecord-2.3.15/lib/active_record/migration.rb:282:in `migrate'
org/jruby/RubyKernel.java:2225:in `send'
/opt/sonarqube-4.1.2/web/WEB-INF/gems/gems/activerecord-2.3.15/lib/active_record/migration.rb:365:in `migrate'
/opt/sonarqube-4.1.2/web/WEB-INF/gems/gems/activerecord-2.3.15/lib/active_record/migration.rb:491:in `migrate'
org/jruby/RubyProc.java:290:in `call'
org/jruby/RubyProc.java:224:in `call'
/opt/sonarqube-4.1.2/web/WEB-INF/gems/gems/activerecord-2.3.15/lib/active_record/migration.rb:567:in `ddl_transaction'
/opt/sonarqube-4.1.2/web/WEB-INF/gems/gems/activerecord-2.3.15/lib/active_record/migration.rb:490:in `migrate'
org/jruby/RubyArray.java:1613:in `each'
/opt/sonarqube-4.1.2/web/WEB-INF/gems/gems/activerecord-2.3.15/lib/active_record/migration.rb:477:in `migrate'
/opt/sonarqube-4.1.2/web/WEB-INF/gems/gems/activerecord-2.3.15/lib/active_record/migration.rb:401:in `up'
/opt/sonarqube-4.1.2/web/WEB-INF/gems/gems/activerecord-2.3.15/lib/active_record/migration.rb:383:in `migrate'
/opt/sonarqube-4.1.2/web/WEB-INF/lib/database_version.rb:62:in `upgrade_and_start'
/opt/sonarqube-4.1.2/web/WEB-INF/app/models/database_migration_manager.rb:109:in `start_migration'
org/jruby/RubyProc.java:290:in `call'
org/jruby/RubyProc.java:228:in `call'
I'm not sure why this "invalid identifier" error came up.
Thanks in advance for your help.
CHARACTERISTICS table structure:
Name                  Null?     Type
ID                    NOT NULL  NUMBER(38)
KEE                             VARCHAR2(100)
NAME                            VARCHAR2(100)
RULE_ID                         NUMBER(38)
CHARACTERISTIC_ORDER            NUMBER(38)
ENABLED                         NUMBER(1)
PARENT_ID                       NUMBER(38)
ROOT_ID                         NUMBER(38)
FUNCTION_KEY                    VARCHAR2(100)
FACTOR_VALUE                    NUMBER(30,20)
FACTOR_UNIT                     VARCHAR2(100)
OFFSET_VALUE                    NUMBER(30,20)
OFFSET_UNIT                     VARCHAR2(100)
CREATED_AT                      TIMESTAMP(6)
UPDATED_AT                      TIMESTAMP(6)
I'm using the latest SQLAlchemy and the latest pymssql from pip to connect to MSSQL server 8.00.2039 (2005?). The difficulty is that table and column names are in Russian. Is it possible to handle this database with SQLAlchemy? At the very least I need to be able to run 'select ... where' queries.
engine = create_engine("mssql+pymssql://%s:%s@RTBD/rt?charset=utf8" % (settings.RT_USER, settings.RT_PWD), echo=True, encoding='utf8')
metadata = MetaData()
metadata.reflect(engine, only = [u"Заказы",])
orders = metadata.tables[u'Заказы']
res = engine.execute(orders.select(orders.c[u'Номер заказа'] == u'14-01-0001'))
Exception is
ValueError Traceback (most recent call last)
<ipython-input-8-50ce93243d1c> in <module>()
----> 1 engine.execute(orders.select(orders.c[orders.columns.keys()[0]] == u'14-01-0001'))
python2.7/site-packages/sqlalchemy/engine/base.pyc in execute(self, statement, *multiparams, **params)
1680
1681 connection = self.contextual_connect(close_with_result=True)
-> 1682 return connection.execute(statement, *multiparams, **params)
1683
1684 def scalar(self, statement, *multiparams, **params):
python2.7/site-packages/sqlalchemy/engine/base.pyc in execute(self, object, *multiparams, **params)
718 type(object))
719 else:
--> 720 return meth(self, multiparams, params)
721
722 def _execute_function(self, func, multiparams, params):
python2.7/site-packages/sqlalchemy/sql/elements.pyc in _execute_on_connection(self, connection, multiparams, params)
315
316 def _execute_on_connection(self, connection, multiparams, params):
--> 317 return connection._execute_clauseelement(self, multiparams, params)
318
319 def unique_params(self, *optionaldict, **kwargs):
python2.7/site-packages/sqlalchemy/engine/base.pyc in _execute_clauseelement(self, elem, multiparams, params)
815 compiled_sql,
816 distilled_params,
--> 817 compiled_sql, distilled_params
818 )
819 if self._has_events or self.engine._has_events:
python2.7/site-packages/sqlalchemy/engine/base.pyc in _execute_context(self, dialect, constructor, statement, parameters, *args)
945 parameters,
946 cursor,
--> 947 context)
948
949 if self._has_events or self.engine._has_events:
python2.7/site-packages/sqlalchemy/engine/base.pyc in _handle_dbapi_exception(self, e, statement, parameters, cursor, context)
1109 )
1110
-> 1111 util.reraise(*exc_info)
1112
1113 finally:
python2.7/site-packages/sqlalchemy/engine/base.pyc in _execute_context(self, dialect, constructor, statement, parameters, *args)
938 statement,
939 parameters,
--> 940 context)
941 except Exception as e:
942 self._handle_dbapi_exception(
python2.7/site-packages/sqlalchemy/engine/default.pyc in do_execute(self, cursor, statement, parameters, context)
433
434 def do_execute(self, cursor, statement, parameters, context=None):
--> 435 cursor.execute(statement, parameters)
436
437 def do_execute_no_params(self, cursor, statement, context=None):
python2.7/site-packages/pymssql.so in pymssql.Cursor.execute (pymssql.c:6057)()
python2.7/site-packages/_mssql.so in _mssql.MSSQLConnection.execute_query (_mssql.c:9858)()
python2.7/site-packages/_mssql.so in _mssql.MSSQLConnection.execute_query (_mssql.c:9734)()
python2.7/site-packages/_mssql.so in _mssql.MSSQLConnection.format_and_run_query (_mssql.c:10814)()
python2.7/site-packages/_mssql.so in _mssql.MSSQLConnection.format_sql_command (_mssql.c:11042)()
python2.7/site-packages/_mssql.so in _mssql._substitute_params (_mssql.c:18359)()
<type 'str'>: (<type 'exceptions.UnicodeEncodeError'>, UnicodeEncodeError('ascii', u'params dictionary did not contain value for placeholder: \u041d\u043e\u043c\u0435\u0440 \u0437\u0430\u043a\u0430\u0437\u0430_1', 57, 62, 'ordinal not in range(128)'))
The query is right and ends with WHERE [Заказы].[Номер заказа] = %(Номер заказа_1)s
But info message from sqla is INFO sqlalchemy.engine.base.Engine {'\xd0\x9d\xd0\xbe\xd0\xbc\xd0\xb5\xd1\x80 \xd0\xb7\xd0\xb0\xd0\xba\xd0\xb0\xd0\xb7\xd0\xb0_1': '14-01-0001'}
The strings \xd0\x9d\xd0\xbe\xd0\xbc\xd0\xb5\xd1\x80 \xd0\xb7\xd0\xb0\xd0\xba\xd0\xb0\xd0\xb7\xd0\xb0_1 and \u041d\u043e\u043c\u0435\u0440 \u0437\u0430\u043a\u0430\u0437\u0430_1 are both encodings of Номер заказа_1.
As stated on the mailing list, FreeTDS and friends are very picky about this. The following test works for me, but for the poster above it did not:
UnixODBC 2.3.0
FreeTDS 0.91
Pyodbc 3.0.7
Linux, not OS X (OS X has tons of problems with tds/pyodbc); I'm running on a Fedora 14 machine here
FreeTDS settings:
[sqlserver_2008_vmware]
host = 172.16.248.142
port = 1213
tds version = 7.2
client charset = UTF8
text size = 50000000
Test script:
# coding: utf-8
from sqlalchemy import create_engine, MetaData, Table, Column, String
e = create_engine("mssql+pyodbc://scott:tiger@ms_2008", echo=True)
#e = create_engine("mssql+pymssql://scott:tiger@172.16.248.142:1213", echo=True)
m = MetaData()
t = Table(u'Заказы', m, Column(u'Номер заказа', String(50)))
m.drop_all(e)
m.create_all(e)
orders = m.tables[u'Заказы']
e.execute(orders.select(orders.c[u'Номер заказа'] == u'14-01-0001'))
part of the output:
CREATE TABLE [Заказы] (
[Номер заказа] VARCHAR(50) NULL
)
2014-03-31 20:57:16,266 INFO sqlalchemy.engine.base.Engine ()
2014-03-31 20:57:16,268 INFO sqlalchemy.engine.base.Engine COMMIT
2014-03-31 20:57:16,270 INFO sqlalchemy.engine.base.Engine SELECT [Заказы].[Номер заказа]
FROM [Заказы]
WHERE [Заказы].[Номер заказа] = ?
2014-03-31 20:57:16,270 INFO sqlalchemy.engine.base.Engine (u'14-01-0001',)
Got a record:
2.1.1 :202 > r.column_data
=> {"data1"=>[1, 2, 3], "data2"=>"data2-3", "array"=>[{"hello"=>1}, {"hi"=>2}], "nest"=>{"nest1"=>"yes"}}
I'm trying to query inside the array object within data1:
2.1.1 :203 > Record.where("column_data -> 'data1' -> 2 = '3' ")
Record Load (0.8ms) SELECT "records".* FROM "records" WHERE (column_data -> 'data1' -> 2 = '3' )
PG::UndefinedFunction: ERROR: operator does not exist: json = unknown
LINE 1: ...* FROM "records" WHERE (column_data -> 'data1' -> 2 = '3' )
^
HINT: No operator matches the given name and argument type(s). You might need to add explicit type casts.
: SELECT "records".* FROM "records" WHERE (column_data -> 'data1' -> 2 = '3' )
ActiveRecord::StatementInvalid: PG::UndefinedFunction: ERROR: operator does not exist: json = unknown
LINE 1: ...* FROM "records" WHERE (column_data -> 'data1' -> 2 = '3' )
^
HINT: No operator matches the given name and argument type(s). You might need to add explicit type casts.
: SELECT "records".* FROM "records" WHERE (column_data -> 'data1' -> 2 = '3' )
from /Users/mmahalwy/.rvm/gems/ruby-2.1.1/gems/rack-mini-profiler-0.9.1/lib/patches/sql_patches.rb:160:in `exec'
from /Users/mmahalwy/.rvm/gems/ruby-2.1.1/gems/rack-mini-profiler-0.9.1/lib/patches/sql_patches.rb:160:in `async_exec'
from /Users/mmahalwy/.rvm/gems/ruby-2.1.1/gems/activerecord-4.0.4/lib/active_record/connection_adapters/postgresql_adapter.rb:791:in `exec_no_cache'
from /Users/mmahalwy/.rvm/gems/ruby-2.1.1/gems/activerecord-4.0.4/lib/active_record/connection_adapters/postgresql/database_statements.rb:138:in `block in exec_query'
from /Users/mmahalwy/.rvm/gems/ruby-2.1.1/gems/activerecord-4.0.4/lib/active_record/connection_adapters/abstract_adapter.rb:442:in `block in log'
from /Users/mmahalwy/.rvm/gems/ruby-2.1.1/gems/activesupport-4.0.4/lib/active_support/notifications/instrumenter.rb:20:in `instrument'
from /Users/mmahalwy/.rvm/gems/ruby-2.1.1/gems/activerecord-4.0.4/lib/active_record/connection_adapters/abstract_adapter.rb:437:in `log'
from /Users/mmahalwy/.rvm/gems/ruby-2.1.1/gems/activerecord-4.0.4/lib/active_record/connection_adapters/postgresql/database_statements.rb:137:in `exec_query'
from /Users/mmahalwy/.rvm/gems/ruby-2.1.1/gems/activerecord-4.0.4/lib/active_record/connection_adapters/postgresql_adapter.rb:908:in `select'
from /Users/mmahalwy/.rvm/gems/ruby-2.1.1/gems/activerecord-4.0.4/lib/active_record/connection_adapters/abstract/database_statements.rb:32:in `select_all'
from /Users/mmahalwy/.rvm/gems/ruby-2.1.1/gems/activerecord-4.0.4/lib/active_record/connection_adapters/abstract/query_cache.rb:63:in `select_all'
from /Users/mmahalwy/.rvm/gems/ruby-2.1.1/gems/activerecord-4.0.4/lib/active_record/querying.rb:36:in `find_by_sql'
from /Users/mmahalwy/.rvm/gems/ruby-2.1.1/gems/activerecord-4.0.4/lib/active_record/relation.rb:585:in `exec_queries'
from /Users/mmahalwy/.rvm/gems/ruby-2.1.1/gems/activerecord-4.0.4/lib/active_record/relation.rb:471:in `load'
from /Users/mmahalwy/.rvm/gems/ruby-2.1.1/gems/activerecord-4.0.4/lib/active_record/relation.rb:220:in `to_a'
from /Users/mmahalwy/.rvm/gems/ruby-2.1.1/gems/activerecord-4.0.4/lib/active_record/relation.rb:573:in `inspect'
from /Users/mmahalwy/.rvm/gems/ruby-2.1.1/gems/railties-4.0.4/lib/rails/commands/console.rb:90:in `start'
from /Users/mmahalwy/.rvm/gems/ruby-2.1.1/gems/railties-4.0.4/lib/rails/commands/console.rb:9:in `start'
from /Users/mmahalwy/.rvm/gems/ruby-2.1.1/gems/railties-4.0.4/lib/rails/commands.rb:62:in `<top (required)>'
from bin/rails:4:in `require'
from bin/rails:4:in `<main>'
2.1.1 :204 >
I've been struggling with this for quite some time now and have no clue how to get it working! Any help would be appreciated!
This response requires Postgres 9.4
In your example data structure you have the following:
2.1.1 :202 > r.column_data
=> {"data1"=>[1, 2, 3], "data2"=>"data2-3", "array"=>[{"hello"=>1}, {"hi"=>2}]}
Unfortunately, checking for the existence of an element in an array only works (to my knowledge) with string values. If we had the following data, we could query it without issue.
{"data1"=>['1', '2', '3'], "data2"=>"data2-3"}
Let's test this out. Note: payload is jsonb. It will not work as a json field.
Dynamic.create(payload: {"data1"=>['1', '2', '3'], "data2"=>"data2-3"})
Dynamic.where("payload -> 'data1' ? '1'").first
=> #<Dynamic id: 8, payload: {"data1"=>["1", "2", "3"], "data2"=>"data2-3"}, created_at: "2014-12-24 02:30:31", updated_at: "2014-12-24 02:30:31">
To find out more, you can check out this article
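The jsonb `?` family of operators tests for a top-level key or a string array element. A tiny Python emulation of the array case (an illustration, not the actual Postgres implementation) shows why the integer elements in data1 are never matched:

```python
import json

# Emulation of jsonb's `?` existence test on an array (illustrative only):
# per the Postgres semantics, only *string* elements can match.
def jsonb_array_exists(array, item):
    return any(isinstance(e, str) and e == item for e in array)

doc = json.loads('{"data1": [1, 2, 3], "data2": "data2-3"}')
fixed = json.loads('{"data1": ["1", "2", "3"], "data2": "data2-3"}')

print(jsonb_array_exists(doc["data1"], "1"))    # False: elements are numbers
print(jsonb_array_exists(fixed["data1"], "1"))  # True: elements are strings
```

This is why storing the values as strings, as in the Dynamic example above, makes `payload -> 'data1' ? '1'` match.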