WARN org.springframework.data.cassandra.core.CassandraTemplate Cannot prepare statement - spring-data

I'm using spring-data-cassandra with a PreparedStatement, but I see the warning above in my logs. Should I be concerned?
Here's how I coded it:
@Repository
class EmailAccountRepository {
private final PreparedStatement PREP_UPDATE_USER_DTLS;
private final CqlSession sessionTemplate;
@Autowired
EmailAccountRepository(CqlSession session) {
this.sessionTemplate = session;
this.PREP_UPDATE_USER_DTLS = this.updateUserDtls(this.sessionTemplate);
}
public void updateUserDetails(String propName, String mailAccount) {
final BoundStatement statement = PREP_UPDATE_USER_DTLS.bind(propName, mailAccount);
sessionTemplate.execute(statement);
}
private PreparedStatement updateUserDtls(CqlSession cqlSession) {
String cql = "update email_account set prop_name = ? where mail_account = ?";
return ((PreparedStatementCreator)
session -> session.prepare(cql))
.createPreparedStatement(cqlSession);
}
}
Then I invoke EmailAccountRepository.updateUserDetails from my REST controller.
Do I need to change my code, or is there a proper class/interface I should use to work with PreparedStatement?
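For comparison, here is a minimal sketch (the class name is illustrative, and this is not necessarily what silences the WARN) that prepares the statement directly against the injected CqlSession instead of going through a cast to PreparedStatementCreator; the DataStax 4.x driver caches prepared statements per session, so preparing the same CQL more than once is inexpensive:

import com.datastax.oss.driver.api.core.CqlSession;
import com.datastax.oss.driver.api.core.cql.PreparedStatement;
import org.springframework.stereotype.Repository;

// Sketch only: prepare once at construction time, reuse the PreparedStatement afterwards.
@Repository
class EmailAccountRepositorySketch {

    private final CqlSession session;
    private final PreparedStatement updateUserDtls;

    EmailAccountRepositorySketch(CqlSession session) {
        this.session = session;
        // CqlSession.prepare(...) is the driver's own entry point for prepared statements
        this.updateUserDtls = session.prepare(
                "update email_account set prop_name = ? where mail_account = ?");
    }

    void updateUserDetails(String propName, String mailAccount) {
        session.execute(updateUserDtls.bind(propName, mailAccount));
    }
}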

Related

mybatis interceptor throws Reflection exception, affects cpu performance

I implemented a MyBatis interceptor, but we found a problem: executing the interceptor leads to a very large number of IllegalAccessExceptions being thrown, which hurts CPU performance.
Shown below is where the problem is. Why doesn't it check the access permission of the field before executing "field.get(target)"?
public class GetFieldInvoker implements Invoker {
private final Field field;
public GetFieldInvoker(Field field) {
this.field = field;
}
@Override
public Object invoke(Object target, Object[] args) throws IllegalAccessException {
try {
return field.get(target);
} catch (IllegalAccessException e) {
if (Reflector.canControlMemberAccessible()) {
field.setAccessible(true);
return field.get(target);
} else {
throw e;
}
}
}
@Override
public Class<?> getType() {
return field.getType();
}
}
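For what it's worth, here is a hypothetical variant (this is not the shipped MyBatis source) that makes the field accessible once up front, so get() never relies on catching IllegalAccessException on the hot path:

import java.lang.reflect.Field;
import org.apache.ibatis.reflection.Reflector;
import org.apache.ibatis.reflection.invoker.Invoker;

// Hypothetical sketch only -- illustrates the "fix accessibility before get()" idea.
public class EagerGetFieldInvoker implements Invoker {

    private final Field field;

    public EagerGetFieldInvoker(Field field) {
        this.field = field;
        // make the field accessible once, at construction time, instead of discovering
        // the problem through an IllegalAccessException on every invoke()
        if (!field.isAccessible() && Reflector.canControlMemberAccessible()) {
            field.setAccessible(true);
        }
    }

    @Override
    public Object invoke(Object target, Object[] args) throws IllegalAccessException {
        return field.get(target);
    }

    @Override
    public Class<?> getType() {
        return field.getType();
    }
}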
My interceptor:
@Intercepts({
@Signature(
type = StatementHandler.class,
method = "prepare",
args = {Connection.class, Integer.class})
})
public class SqlIdInterceptor implements Interceptor {
private static final int MAX_LEN = 256;
private final RoomboxLogger logger = RoomboxLogManager.getLogger();
@Override
public Object intercept(Invocation invocation) throws Throwable {
StatementHandler statementHandler = realTarget(invocation.getTarget());
MetaObject metaObject = SystemMetaObject.forObject(statementHandler);
BoundSql boundSql = (BoundSql) metaObject.getValue("delegate.boundSql");
String originalSql = boundSql.getSql();
MappedStatement mappedStatement =
(MappedStatement) metaObject.getValue("delegate.mappedStatement");
String id = mappedStatement.getId();
if (id != null) {
int len = id.length();
if (len > MAX_LEN) {
logger.warn("too long id", "id", id, "len", len);
}
}
String newSQL = "# " + id + "\n" + originalSql;
metaObject.setValue("delegate.boundSql.sql", newSQL);
return invocation.proceed();
}
@SuppressWarnings("unchecked")
public static <T> T realTarget(Object target) {
if (Proxy.isProxyClass(target.getClass())) {
MetaObject metaObject = SystemMetaObject.forObject(target);
return realTarget(metaObject.getValue("h.target"));
}
return (T) target;
}
}
Flame Graph (screenshots omitted)
I need help: how can I avoid throwing these exceptions? Is there another way to resolve this problem?
Thanks.
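One way to cut down the reflective field walking, sketched under the assumption that what you mainly need from the MappedStatement is its id (the class name below is illustrative; rewriting the SQL itself would still need your StatementHandler hook): intercept at the Executor level, where the MappedStatement arrives as a plain method argument, so no MetaObject or GetFieldInvoker is involved.

import java.util.Properties;
import org.apache.ibatis.executor.Executor;
import org.apache.ibatis.mapping.MappedStatement;
import org.apache.ibatis.plugin.Interceptor;
import org.apache.ibatis.plugin.Intercepts;
import org.apache.ibatis.plugin.Invocation;
import org.apache.ibatis.plugin.Plugin;
import org.apache.ibatis.plugin.Signature;
import org.apache.ibatis.session.ResultHandler;
import org.apache.ibatis.session.RowBounds;

// Sketch: at the Executor level the MappedStatement is an argument,
// so its id can be read without any reflection.
@Intercepts({
    @Signature(type = Executor.class, method = "update",
            args = {MappedStatement.class, Object.class}),
    @Signature(type = Executor.class, method = "query",
            args = {MappedStatement.class, Object.class, RowBounds.class, ResultHandler.class})
})
public class StatementIdInterceptor implements Interceptor {

    private static final int MAX_LEN = 256;

    @Override
    public Object intercept(Invocation invocation) throws Throwable {
        MappedStatement ms = (MappedStatement) invocation.getArgs()[0];
        String id = ms.getId();
        if (id != null && id.length() > MAX_LEN) {
            System.out.println("too long id: " + id); // replace with your logger
        }
        return invocation.proceed();
    }

    @Override
    public Object plugin(Object target) {
        return Plugin.wrap(target, this);
    }

    @Override
    public void setProperties(Properties properties) {
        // no properties needed for this sketch
    }
}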

Memory Cache .Net Core Not Persisting Values

I have a .NET Core 2.1 application. In the Startup.cs configuration method, I use:
services.AddDbContext<ApplicationDbContext>(options =>
options.UseSqlServer(Configuration.GetConnectionString("DefaultConnection")));
...
services.AddMemoryCache();
Then in my controller:
public class DropDownListController : Controller
{
private readonly ApplicationDbContext _context;
private readonly IMemoryCache _memoryCache;
private const string ProvidersCacheKey = "providers";
private const string AgenciesCacheKey = "agencies";
public DropDownListController(ApplicationDbContext context, IMemoryCache memoryCache )
{
_context = context;
_memoryCache = memoryCache;
}
}
and, also in the controller, the method to get the dropdown list:
public JsonResult GetProvider()
{
IEnumerable<DropDownListCode.NameValueStr> providerlist;
if (_memoryCache.TryGetValue(ProvidersCacheKey, out providerlist))
{
return Json(providerlist);
}
else
{
MemoryCacheEntryOptions cacheExpirationOptions = new MemoryCacheEntryOptions();
cacheExpirationOptions.AbsoluteExpiration = DateTime.Now.AddDays(30);
cacheExpirationOptions.Priority = CacheItemPriority.Normal;
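// note: these options are built but never passed to _memoryCache.Set below,
// so the expiration and priority settings are not actually applied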
DropDownListCode um = new DropDownListCode(_context);
var result = um.GetProviderList();
_memoryCache.Set(ProvidersCacheKey, result);
return Json(result);
}
}
When I set a breakpoint on the line:
return Json(providerlist);
I see the ProvidersCacheKey is in the _memoryCache, but it has no value.
What happened to the data?
When I do a Quick Watch on _memoryCache, I can see the DbContext object was destroyed. But how can that be? The code works fine, yet the cache entry does not contain the data I saved to it.
Any help would be appreciated.
The method to get providers is:
public IEnumerable<NameValueStr> GetProviderList()
{
var providerlist = (from a in _context.AgencyProvider
where a.Provider == a.AgencyId
select new NameValueStr
{
id = a.Provider,
name = a.Name
});
return providerlist.Distinct();
}
Adding "ToList()" in the calling method worked:
MemoryCacheEntryOptions cacheExpirationOptions = new MemoryCacheEntryOptions();
cacheExpirationOptions.AbsoluteExpiration = DateTime.Now.AddMinutes(30);
cacheExpirationOptions.Priority = CacheItemPriority.Normal;
DropDownListCode um = new DropDownListCode(_context);
var result = um.GetProviderList().ToList();
_memoryCache.Set(ProvidersCacheKey, result, cacheExpirationOptions); // apply the options built above
return Json(result);
All credit goes to Steve Py… Thank you sir!

netty SimpleChannelInboundHandler<String> channelRead0 only occasionally invoked

I know that there are several similar questions that have either been answered or still outstanding, however, for the life of me...
Later Edit 2016-08-25 10:05 CST - Actually, I asked the wrong question.
The question is the following: given that I have both a netty server (taken from the DiscardServer example) and a netty client (see below), what must I do to force the DiscardServer to immediately send the client a request?
I have added an OutboundHandler to the server and to the client.
After looking at both the DiscardServer and PingPongServer examples, there is an external event occurring to kick off all the action. In the case of Discard server, it is originally waiting for a telnet connection, then will transmit whatever was in the telnet msg to the client.
In the case of PingPongServer, the SERVER is waiting on the client to initiate action.
What I want is for the Server to immediately start transmitting after connection with the client. None of the examples from netty seem to do this.
If I have missed something, and someone can point it out, much good karma.
My client:
public final class P4Listener {
static final Logger LOG;
static final String HOST;
static final int PORT;
static final Boolean SSL = Boolean.FALSE;
public static Dto DTO;
static {
LOG = LoggerFactory.getLogger(P4Listener.class);
HOST = P4ListenerProperties.getP4ServerAddress();
PORT = Integer.valueOf(P4ListenerProperties.getListenerPort());
DTO = new Dto();
}
public static String getId() { return DTO.getId(); }
public static void main(String[] args) throws Exception {
final SslContext sslCtx;
if (SSL) {
LOG.info("{} creating SslContext", getId());
sslCtx = SslContextBuilder.forClient().trustManager(InsecureTrustManagerFactory.INSTANCE).build();
} else {
sslCtx = null;
}
EventLoopGroup group = new NioEventLoopGroup();
try {
Bootstrap b = new Bootstrap();
b.group(group).channel(NioSocketChannel.class)
.handler(new LoggingHandler(LogLevel.INFO))
.handler(new P4ListenerInitializer(sslCtx));
// Start the connection attempt.
LOG.debug(" {} starting connection attempt...", getId());
Channel ch = b.connect(HOST, PORT).sync().channel();
// ChannelFuture localWriteFuture = ch.writeAndFlush("ready\n");
// localWriteFuture.sync();
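// NOTE: without something like ch.closeFuture().sync() here, the try block ends right
// after connecting and the finally block shuts the group down, so the client may not
// stay alive long enough for channelRead0 to ever be invoked.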
} finally {
group.shutdownGracefully();
}
}
}
public class P4ListenerHandler extends SimpleChannelInboundHandler<String> {
static final Logger LOG = LoggerFactory.getLogger(P4ListenerHandler.class);
static final DateTimeFormatter DTFormatter = DateTimeFormatter.ofPattern("yyyyMMdd-HHMMss.SSS");
static final String EndSOT;
static final String StartSOT;
static final String EOL = "\n";
static final ClassPathXmlApplicationContext AppContext;
static {
EndSOT = P4ListenerProperties.getEndSOT();
StartSOT = P4ListenerProperties.getStartSOT();
AppContext = new ClassPathXmlApplicationContext(new String[] { "applicationContext.xml" });
}
private final RequestValidator rv = new RequestValidator();
private JAXBContext jaxbContext = null;
private Unmarshaller jaxbUnmarshaller = null;
private boolean initialized = false;
private Dto dto;
public P4ListenerHandler() {
dto = new Dto();
}
public Dto getDto() { return dto; }
public String getId() { return getDto().getId(); }
Message convertXmlToMessage(String xml) {
if (xml == null)
throw new IllegalArgumentException("xml message is null!");
try {
jaxbContext = JAXBContext.newInstance(p4.model.xml.request.Message.class, p4.model.xml.request.Header.class,
p4.model.xml.request.Claims.class, p4.model.xml.request.Insurance.class,
p4.model.xml.request.Body.class, p4.model.xml.request.Prescriber.class,
p4.model.xml.request.PriorAuthorization.class,
p4.model.xml.request.PriorAuthorizationSupportingDocumentation.class);
jaxbUnmarshaller = jaxbContext.createUnmarshaller();
StringReader strReader = new StringReader(xml);
Message m = (Message) jaxbUnmarshaller.unmarshal(strReader);
return m;
} catch (JAXBException jaxbe) {
String error = StacktraceUtil.getCustomStackTrace(jaxbe);
LOG.error(error);
throw new P4XMLUnmarshalException("Problems when attempting to unmarshal transmission string: \n" + xml,
jaxbe);
}
}
@Override
public void channelActive(ChannelHandlerContext ctx) {
LOG.debug("{} let server know we are ready", getId());
ctx.writeAndFlush("Ready...\n");
}
/**
* Important - this method will be renamed to
* <code><b>messageReceived(ChannelHandlerContext, I)</b></code> in netty 5.0
*
* @param ctx
* @param msg
*/
@Override
protected void channelRead0(ChannelHandlerContext ctx, String msg) throws Exception {
ChannelFuture lastWriteFuture = null;
LOG.debug("{} -- received message: {}", getId(), msg);
Channel channel = ctx.channel();
Message m = null;
try {
if (msg instanceof String && msg.length() > 0) {
m = convertXmlToMessage(msg);
m.setMessageStr(msg);
dto.setRequestMsg(m);
LOG.info("{}: received TIMESTAMP: {}", dto.getId(), LocalDateTime.now().format(DTFormatter));
LOG.debug("{}: received from server: {}", dto.getId(), msg);
/*
* theoretically we have a complete P4(XML) request
*/
final List<RequestFieldError> errorList = rv.validateMessage(m);
if (!errorList.isEmpty()) {
for (RequestFieldError fe : errorList) {
lastWriteFuture = channel.writeAndFlush(fe.toString().concat(EOL));
}
}
/*
* Create DBHandler with message, messageStr, clientIp to get
* dbResponse
*/
InetSocketAddress socketAddress = (InetSocketAddress) channel.remoteAddress();
InetAddress inetaddress = socketAddress.getAddress();
String clientIp = inetaddress.getHostAddress();
/*
* I know - bad form to ask the ApplicationContext for the
* bean... BUT ...lack of time turns angels into demons
*/
final P4DbRequestHandler dbHandler = (P4DbRequestHandler) AppContext.getBean("dbRequestHandler");
// must set the requestDTO for the dbHandler!
dbHandler.setClientIp(clientIp);
dbHandler.setRequestDTO(dto);
//
// build database request and receive response (string)
String dbResponse = dbHandler.submitDbRequest();
/*
* create ResponseHandler and get back response string
*/
P4ResponseHandler responseHandler = new P4ResponseHandler(dto, dbHandler);
String responseStr = responseHandler.decodeDbServiceResponse(dbResponse);
/*
* write response string to output and repeat exercise
*/
LOG.debug("{} -- response to be written back to server:\n {}", dto.getId(), responseStr);
lastWriteFuture = channel.writeAndFlush(responseStr.concat(EOL));
//
LOG.info("{}: response sent TIMESTAMP: {}", dto.getId(), LocalDateTime.now().format(DTFormatter));
} else {
throw new P4EventException(dto.getId() + " -- Message received is not a String");
}
processWriteFutures(lastWriteFuture);
} catch (Throwable t) {
String tError = StacktraceUtil.getCustomStackTrace(t);
LOG.error(tError);
} finally {
if (lastWriteFuture != null) {
lastWriteFuture.sync();
}
}
}
private void processWriteFutures(ChannelFuture writeFuture) throws InterruptedException {
// Wait until all messages are flushed before closing the channel.
if (writeFuture != null) {
writeFuture.sync();
}
}
@Override
public void exceptionCaught(ChannelHandlerContext ctx, Throwable cause) {
cause.printStackTrace();
ctx.close();
}
}
/**
* Creates a newly configured {@link ChannelPipeline} for a new channel.
*/
public class P4ListenerInitializer extends ChannelInitializer<SocketChannel> {
private static final StringDecoder DECODER = new StringDecoder();
private static final StringEncoder ENCODER = new StringEncoder();
private final SslContext sslCtx;
public P4ListenerInitializer(SslContext sslCtx) {
this.sslCtx = sslCtx;
}
@Override
public void initChannel(SocketChannel ch) {
P4ListenerHandler lh = null;
ChannelPipeline pipeline = ch.pipeline();
if (sslCtx != null) {
P4Listener.LOG.info("{} -- constructing SslContext new handler ", P4Listener.getId());
pipeline.addLast(sslCtx.newHandler(ch.alloc(), P4Listener.HOST, P4Listener.PORT));
} else {
P4Listener.LOG.info("{} -- SslContext null; bypassing adding sslCtx.newHandler(ch.alloc(), P4Listener.HOST, P4Listener.PORT) ", P4Listener.getId());
}
// Add the text line codec combination first,
pipeline.addLast(new DelimiterBasedFrameDecoder(8192, Delimiters.lineDelimiter()));
pipeline.addLast(DECODER);
P4Listener.LOG.debug("{} -- added Decoder ", P4Listener.getId());
pipeline.addLast(ENCODER);
P4Listener.LOG.debug("{} -- added Encoder ", P4Listener.getId());
// and then business logic.
pipeline.addLast(lh = new P4ListenerHandler());
P4Listener.LOG.debug("{} -- added P4ListenerHandler: {} ", P4Listener.getId(), lh.getClass().getSimpleName());
}
}
@Sharable
public class P4ListenerOutboundHandler extends ChannelOutboundHandlerAdapter {
static final Logger LOG = LoggerFactory.getLogger(P4ListenerOutboundHandler.class);
private Dto outBoundDTO = new Dto();
public String getId() {return this.outBoundDTO.getId(); }
@Override
public void write(ChannelHandlerContext ctx, Object msg, ChannelPromise promise) {
try {
ChannelFuture lastWrite = ctx.write(Unpooled.copiedBuffer((String) msg, CharsetUtil.UTF_8));
try {
if (lastWrite != null) {
lastWrite.sync();
promise.setSuccess();
}
} catch (InterruptedException e) {
promise.setFailure(e);
e.printStackTrace();
}
} finally {
ReferenceCountUtil.release(msg);
}
}
}
output from client
Just override channelActive(...) on the handler of the server and trigger a write there.
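A minimal sketch of that on the server side (class and payload names are illustrative, not from the question; it assumes the server pipeline has the same line-delimiter/String codecs as the client initializer above): the server's business handler overrides channelActive and writes as soon as the connection is established, instead of waiting for the client to speak first.

import io.netty.channel.ChannelHandlerContext;
import io.netty.channel.SimpleChannelInboundHandler;

public class P4ServerHandler extends SimpleChannelInboundHandler<String> {

    @Override
    public void channelActive(ChannelHandlerContext ctx) {
        // the connection is up: push the first request to the client immediately,
        // without waiting for any inbound message
        ctx.writeAndFlush("<first-request-payload>\n");
    }

    @Override
    protected void channelRead0(ChannelHandlerContext ctx, String msg) {
        // handle the client's response here
    }
}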

Extracting CD tagged tokens using UIMA

I wrote an annotator that extracts all CD-tagged tokens, and the code looks like this:
public class WeightAnnotator extends JCasAnnotator_ImplBase {
private Logger logger = Logger.getLogger(getClass().getName());
public static List<Token> weightTokens = new ArrayList<Token>();
public static final String PARAM_STRING = "stringParam";
@ConfigurationParameter(name = PARAM_STRING)
private String stringParam;
@Override
public void initialize(UimaContext context) throws ResourceInitializationException {
super.initialize(context);
}
@Override
public void process(JCas jCas) throws AnalysisEngineProcessException {
logger.info("Starting processing.");
for (Sentence sentence : JCasUtil.select(jCas, Sentence.class)) {
List<Token> tokens = JCasUtil.selectCovered(jCas, Token.class, sentence);
for (Token token : tokens) {
int begin = token.getBegin();
int end = token.getEnd();
if (token.getPos().equals( PARAM_STRING)) {
WeightAnnotation ann = new WeightAnnotation(jCas, begin, end);
ann.addToIndexes();
System.out.println("Token: " + token.getCoveredText());
}
}
}
}
}
but when I try to create an iterator over it in a pipeline, the iterator returns null. Here is how my pipeline looks.
AnalysisEngineDescription weightDescriptor = AnalysisEngineFactory.createEngineDescription(WeightAnnotator.class,
WeightAnnotator.PARAM_STRING, "CD"
);
AggregateBuilder builder = new AggregateBuilder();
builder.add(sentenceDetectorDescription);
builder.add(tokenDescriptor);
builder.add(posDescriptor);
builder.add(weightDescriptor);
builder.add(writer);
for (JCas jcas : SimplePipeline.iteratePipeline(reader, builder.createAggregateDescription())) {
Iterator iterator1 = JCasUtil.iterator(jcas, WeightAnnotation.class);
while (iterator1.hasNext()) {
WeightAnnotation weights = (WeightAnnotation) iterator1.next();
logger.info("Token: " + weights.getCoveredText());
}
}
I generated WeightAnnotator and WeightAnnotator_Type using JCasGen. I debugged the entire code but I don't understand where I am getting this wrong. Any ideas on how to improve this are appreciated.
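Two things worth checking. First, process() compares token.getPos() against the constant PARAM_STRING (the literal "stringParam") rather than against the injected stringParam value ("CD"), so no WeightAnnotation would ever be added to the indexes. Second, the idiomatic uimaFIT way to walk the results afterwards is JCasUtil.select. A sketch (a drop-in replacement for the process() method above, plus the iteration loop; everything else stays as in the question):

@Override
public void process(JCas jCas) throws AnalysisEngineProcessException {
    for (Sentence sentence : JCasUtil.select(jCas, Sentence.class)) {
        for (Token token : JCasUtil.selectCovered(jCas, Token.class, sentence)) {
            // compare against the configured value ("CD"), not the parameter name constant
            if (stringParam.equals(token.getPos())) {
                new WeightAnnotation(jCas, token.getBegin(), token.getEnd()).addToIndexes();
            }
        }
    }
}

// after the pipeline step:
for (JCas jcas : SimplePipeline.iteratePipeline(reader, builder.createAggregateDescription())) {
    for (WeightAnnotation weights : JCasUtil.select(jcas, WeightAnnotation.class)) {
        logger.info("Token: " + weights.getCoveredText());
    }
}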

spring data mongodb converter

I am using spring-data-mongodb 1.4.1.RELEASE.
My entity 'Event' has a getter method which is calculated based on other properties:
public int getStatus() {
return (getMainEventId() == null) ? (elapseTimeInMin() < MINIMUM_TIME ? CANDIDATE :
VALID) : POINTER;
}
I wanted the property 'status' to be persisted only through the getter, so I wrote converters:
@WritingConverter
public class EventWriteConverter implements Converter<Event ,BasicDBObject > {
static final Logger logger = LoggerFactory.getLogger(EventWriteConverter.class.getCanonicalName());
public BasicDBObject convert(Event event) {
logger.info("converting " +event );
if (event.getMainEventId() != null)
return new BasicDBObject("mainEventId", event.getMainEventId() );
BasicDBObject doc = new BasicDBObject("status",event.getStatus()).
append("updated_date",new Date()).
append("start",event.getS0()).
append("end",event.getS1()).
append("location",event.getLocation());
// note: this stores the list under "access_points", while the read converter below looks for "hot_points"
doc.append("access_points",event.getHotPoints());
return doc;
}
}
@ReadingConverter
public class EventReadConverter implements Converter<BasicDBObject, Event> {
@Inject
HotPointRepositry hotRepositry;
static final Logger logger = LoggerFactory.getLogger(EventReadConverter.class.getCanonicalName());
public Event convert(BasicDBObject doc) {
logger.info(" converting ");
Event event = new Event();
event.setId(doc.getObjectId("_id"));
event.setS0(doc.getDate("start"));
event.setS1(doc.getDate("end"));
BasicDBList dblist = (BasicDBList) doc.get("hot_points");
if (dblist != null) {
for (Object obj : dblist) {
ObjectId hotspotId = ((BasicDBObject) obj).getObjectId("_id");
event.addHot(hotRepositry.findOne(hotspotId));
}
}
dblist = (BasicDBList) doc.get("devices");
if (dblist != null) {
for (Object obj : dblist)
event.addDevice(obj.toString());
}
event.setMainEventId(doc.getObjectId("mainEventId"));
return event;
}
}
My test mongo configuration is
#Profile("test")
#Configuration
#EnableMongoRepositories(basePackages = "com.echo.spring.data.mongo")
#ComponentScan(basePackages = "com.echo.spring.data.mongo" )
public class MongoDbTestConfig extends AbstractMongoConfiguration {
static final Logger logger = LoggerFactory.getLogger(MongoDbTestConfig.class.getCanonicalName());
@Override
protected String getDatabaseName() {
return "echo";
}
@Override
public Mongo mongo() {
return new Fongo("echo-test").getMongo();
}
@Override
protected String getMappingBasePackage() {
return "com.echo.spring.data.mongo";
}
@Bean
@Override
public CustomConversions customConversions() {
logger.info("loading custom converters");
List<Converter<?, ?>> converterList = new ArrayList<Converter<?, ?>>();
converterList.add(new EventReadConverter());
converterList.add(new EventWriteConverter());
return new CustomConversions(converterList);
}
}
And my test (using fongo) is
ActiveProfiles("test")
#RunWith(SpringJUnit4ClassRunner.class)
#ContextConfiguration(classes = MongoDbTestConfig.class )
public class SampleMongoApplicationTests {
@Test
@ShouldMatchDataSet(location = "/MongoJsonData/events.json")
public void shouldSaveEvent() throws IOException {
URL url = Resources.getResource("MongoJsonData/events.json");
List<String> lines = Resources.readLines(url,Charsets.UTF_8);
for (String line : lines) {
Event event = objectMapper.readValue(line.getBytes(),Event.class);
eventRepository.save(event);
}
}
}
I can see the converters are loaded when the configuration's customConversions() method is called.
I added logging and breakpoints in the convert methods, but they do not seem to be called when I run or debug, even though they are loaded.
What am I doing wrong?
I had a similar situation. I followed Spring - Mongodb storing/retrieving enums as int not string,
and I needed both the Converter AND the ConverterFactory wired to get it working.
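For reference, a minimal sketch of what "both wired" can look like in the test configuration above; the Integer-to-enum factory is illustrative (in the spirit of the linked enums-as-int answer), not code from the question, and it assumes CustomConversions accepts ConverterFactory instances alongside plain Converters, which is why the list is typed List<Object> rather than List<Converter<?, ?>>:

import java.util.ArrayList;
import java.util.List;
import org.springframework.core.convert.converter.Converter;
import org.springframework.core.convert.converter.ConverterFactory;
import org.springframework.data.mongodb.core.convert.CustomConversions;

// Illustrative factory: maps a stored int back to an enum constant by ordinal.
class IntegerToEnumConverterFactory implements ConverterFactory<Integer, Enum> {
    @Override
    public <T extends Enum> Converter<Integer, T> getConverter(final Class<T> targetType) {
        return new Converter<Integer, T>() {
            @Override
            public T convert(Integer source) {
                return targetType.getEnumConstants()[source];
            }
        };
    }
}

// ...and in MongoDbTestConfig, register the factory next to the converters:
@Bean
@Override
public CustomConversions customConversions() {
    List<Object> converterList = new ArrayList<Object>();
    converterList.add(new EventReadConverter());
    converterList.add(new EventWriteConverter());
    converterList.add(new IntegerToEnumConverterFactory());
    return new CustomConversions(converterList);
}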