An error occurred while starting the X11 server: "Failed to activate core devices." Click Quit to quit X11. Click Report to see more details or send a report to Apple.
15/11/12 12:09:51 INFO SparkContext: Running Spark version 1.5.1 Exception in thread "main" java.lang.NoClassDefFoundError: scala/collection/GenTraversableOnce$class at org.apache.spark.util.TimeStampedWeakValueHashMap.<init>(TimeStampedWeakValueHashMap.scala:42) at org.apache.spark.SparkContext.<init>(SparkContext.scala:287) Caused by: java.lang.ClassNotFoundException: scala.collection.GenTraversableOnce$class
ReliableDeliverySupervisor: Association with remote system [akka.tcp://sparkDriver@100.64.80.93:57108] has failed, address is now gated for [5000] ms. Reason is: [scala.Option; local class incompatible: stream classdesc serialVersionUID = -114498752079829388, local class serialVersionUID = -2062608324514658839].
The `scala.Option; local class incompatible` message shows that the Scala version is wrong: Spark 1.5.1 is built against Scala 2.10 by default, so the project's Scala dependency has to be changed to match.
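A minimal sketch of the fix, assuming an sbt build (the artifact coordinates are the standard ones for Spark 1.5.1; the exact patch version of Scala 2.10 is a placeholder):

```scala
// build.sbt — pin Scala to the 2.10 line that Spark 1.5.1's default artifacts use
scalaVersion := "2.10.4"

// %% appends the Scala binary version, resolving to spark-core_2.10
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.5.1"
```

With Maven the equivalent is depending on `spark-core_2.10` explicitly instead of a `_2.11` artifact.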
After changing the version, it still could not connect. Local output:
15/11/12 21:46:22 ERROR SparkUncaughtExceptionHandler: Uncaught exception in thread Thread[appclient-registration-retry-thread,5,main] java.util.concurrent.RejectedExecutionException: Task java.util.concurrent.FutureTask@5430d0ff rejected from java.util.concurrent.ThreadPoolExecutor@7819693b[Running, pool size = 1, active threads = 0, queued tasks = 0, completed tasks = 1] at java.util.concurrent.ThreadPoolExecutor$AbortPolicy.rejectedExecution(ThreadPoolExecutor.java:2047) at java.util.concurrent.ThreadPoolExecutor.reject(ThreadPoolExecutor.java:823) at java.util.concurrent.ThreadPoolExecutor.execute(ThreadPoolExecutor.java:1369) at java.util.concurrent.AbstractExecutorService.submit(AbstractExecutorService.java:112)
Spark's log is as follows:
A
15/11/12 21:46:03 ERROR ErrorMonitor: dropping message [class akka.actor.ActorSelectionMessage] for non-local recipient [Actor[akka.tcp://sparkMaster@10.19.27.215:4041/]] arriving at [akka.tcp://sparkMaster@10.19.27.215:4041] inbound addresses are [akka.tcp://sparkMaster@master1:4041]
B
15/11/12 22:00:41 WARN ReliableDeliverySupervisor: Association with remote system [akka.tcp://sparkDriver@100.64.80.93:61812] has failed, address is now gated for [5000] ms. Reason is: [Disassociated].
Yes, this message means that one of the workers tried to contact you using your IP address (10.129.7.154), but Akka is (somewhat stupidly) configured to rely on a DNS name (namely ip-10-129-7-154). If you’ve set up the Spark standalone mode, there was a bug in the scripts where they would use an IP address for the master instead of a hostname.
So after changing SMART_IP to the IP address rather than the hostname, the local machine could finally connect to Spark. The settings were as follows:
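A hedged sketch of what that setting looks like: SMART_IP appears to be this post's name for the master-address variable; in stock Spark standalone mode the corresponding variable is `SPARK_MASTER_IP` in `conf/spark-env.sh`. The IP below is the one from the log above, used purely as an illustration.

```shell
# conf/spark-env.sh on the master (illustrative value, taken from the log above)
# Binding by IP instead of hostname avoids the Akka hostname/IP mismatch
# shown in the "dropping message ... non-local recipient" error.
export SPARK_MASTER_IP=10.19.27.215
```

The driver must then connect with exactly the same form of address, e.g. `spark://10.19.27.215:7077`, since Akka treats `master1` and `10.19.27.215` as different inbound addresses.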
MySQL's CONCAT() function concatenates two or more strings into a single string.
MySQL's GROUP_CONCAT() function returns a string of concatenated non-NULL values from a group.
The Person table in the database contains:
| id | name | source | age |
|----|------|--------|-----|
| 1  | A    | GP     | 6   |
| 2  | B    | GP     | 2   |
| 3  | A    | FB     | 1   |
| 4  | C    | FB     | 4   |
| 5  | D    | FB     | 5   |
| 6  | A    | FB     | 3   |
| 7  | C    | TW     | 7   |
1.SQL:
select name, count(distinct source) as sourceCount, group_concat(distinct source separator "/") as sources from Person group by name;
Query Result:
| name | sourceCount | sources |
|------|-------------|---------|
| A    | 2           | GP/FB   |
| B    | 1           | GP      |
| C    | 2           | FB/TW   |
| D    | 1           | FB      |
2.SQL:
select name, count(distinct source) as sourceCount, group_concat(distinct source separator "/") as sources from Person group by name having sourceCount = 1 and sources = 'FB';
Query Result:
| name | sourceCount | sources |
|------|-------------|---------|
| D    | 1           | FB      |
3.SQL:
select name, count(distinct age) as ageCount, group_concat(age order by age separator "#") as ages from Person group by name;
Query Result:
| name | ageCount | ages  |
|------|----------|-------|
| A    | 3        | 1#3#6 |
| B    | 1        | 2     |
| C    | 2        | 4#7   |
| D    | 1        | 5     |
2. mysql -N: do not display column names
An ordinary query includes the column names in its result:
mysql -h xxxx -P 8000 -u'xxx' -p'xxx' -D xxdb -e "select name from Person where name = 'A'";
+-----------+
| name      |
+-----------+
| not found |
+-----------+
With -N, the result omits the column names:
mysql -N -h xxxx -P 8000 -u'xxx' -p'xxx' -D xxdb -e "select name from Person where name = 'A'";
The JVM's just-in-time compiler: the Just-In-Time compiler is abbreviated JIT. A Java program is initially executed by an interpreter; when the JVM notices that a method or code block runs especially frequently, it treats it as "hot spot code". To speed up hot code, the JVM compiles it into machine code for the local architecture, applying optimizations at several tiers. The compiler that performs this job is the just-in-time compiler (JIT).
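To see this in action, a small method called many times crosses HotSpot's invocation threshold and gets JIT-compiled. The class and method names below are made up for illustration; the threshold value is the approximate HotSpot default, not something this post states.

```java
public class HotSpotDemo {
    // A tiny method invoked many times. Once its invocation count crosses
    // HotSpot's compile threshold (roughly 10,000 for the server compiler),
    // the JIT compiles it from interpreted bytecode to native machine code.
    static long sum(int n) {
        long s = 0;
        for (int i = 0; i < n; i++) {
            s += i;
        }
        return s;
    }

    public static void main(String[] args) {
        long total = 0;
        // Enough calls to make sum() "hot" and trigger JIT compilation.
        for (int i = 0; i < 20000; i++) {
            total += sum(100);
        }
        System.out.println(total);
    }
}
```

Running it with the standard HotSpot flag `java -XX:+PrintCompilation HotSpotDemo` prints a line when each method (including `sum`) is compiled, which makes the interpreter-to-JIT transition visible.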
While coding, taking care of a few points helps avoid memory-leak issues:
- Keep the session timeout as low as practical.
- Release the session when it is no longer required, using HttpSession.invalidate().
- Store as little data as possible in the HttpSession.
- Avoid creating an HttpSession in a JSP page by default, using the page directive <%@ page session="false" %>.
- Use StringBuffer's append() method instead of String concatenation. String is an immutable object, so concatenation unnecessarily creates many temporary objects on the heap, which hurts performance.
- For example, building a query such as String query = "SELECT id, name FROM t_customer where " + … with "+" creates several intermediate String objects, whereas writing the same query with StringBuffer's append() creates only one buffer object, since a StringBuffer is mutable and can be modified over and over again.
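A minimal sketch of the append() pattern, using StringBuilder (the unsynchronized successor to StringBuffer); the table and column names are illustrative, matching the example above:

```java
public class QueryBuilder {
    // Build the query by appending into one mutable buffer instead of
    // chaining "+" on immutable Strings, so no intermediate String
    // objects are allocated per concatenation step.
    static String customerQuery(int id) {
        StringBuilder sb = new StringBuilder();
        sb.append("SELECT id, name FROM t_customer WHERE id = ");
        sb.append(id);
        return sb.toString();
    }

    public static void main(String[] args) {
        System.out.println(customerQuery(42));
    }
}
```

In real JDBC code a PreparedStatement with a `?` placeholder is still preferable to interpolating the id, for both caching and SQL-injection reasons.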
- In JDBC code, avoid "*" when writing queries; it is good practice to name the columns in the select statement.
- Use a PreparedStatement instead of a Statement when the query needs to be executed frequently: a PreparedStatement is a precompiled SQL statement, whereas a Statement is compiled each time the SQL is sent to the database.
- Close the ResultSet and Statement before reusing them.
- If stmt = con.prepareStatement(sqlQuery) is called inside a loop, close it inside the loop.
- Close the ResultSet, Statement, PreparedStatement, and Connection in a finally block.
While testing for memory leaks, I learned a few things about GC.
You cannot disable Java GC, and we cannot decide when a GC happens.
System.gc() vs the GC button in JVisualVM/JConsole: as far as I know, JConsole and other such tools simply call System.gc(); there is no other option. As everyone knows, Java tells us not to rely on System.gc(), but that does not mean it does nothing at all.
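The "request, not a command" nature of System.gc() can be sketched with a WeakReference, which the collector clears when its referent becomes unreachable. The class name is made up; note that whether the reference is actually cleared after one System.gc() call is JVM-dependent, which is exactly the point.

```java
import java.lang.ref.WeakReference;

public class GcRequestDemo {
    public static void main(String[] args) {
        Object payload = new Object();
        WeakReference<Object> ref = new WeakReference<>(payload);
        payload = null;   // drop the only strong reference

        // A *request* to run the collector, not a command. HotSpot
        // usually honors it and clears the weak reference, but the
        // Java spec does not guarantee a collection happens at all.
        System.gc();

        System.out.println("weak ref cleared: " + (ref.get() == null));
    }
}
```

This is why GC-dependent tests are flaky by construction: the same code can print `true` on one JVM or run and `false` on another.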