user-error: Not in table data field - emacs

The following table
#+BEGIN: clocktable :maxlevel 3 :tcolumns 4 :scope file :block 2015-6 :narrow 60
| Headline | Time | | |
|--------------+--------+------+------|
| *Total time* | *3:57* | | |
|--------------+--------+------+------|
| Tasks | 3:57 | | |
| 1 | | 3:57 | |
| 2 | | | 3:57 |
#+TBLFM: #3$5..#>$5=vsum($2..$4)*100
gives me
user-error: Not in table data field
where it should put the total of all previous columns, multiplied by 100, into a new column. Via http://notes.secretsauce.net/notes/2014/10/01_org-mode-for-invoices.html
I see two possible solutions: a) add a command to the TBLFM line to add an additional column, or b) make clocktable generate the additional column, but I don't know how to do either.

Here is a quick hack that you could bind to a key; it adds the extra column if needed and then recalculates:
(defun maybe-add-column-and-update ()
  "Add the extra column if it is needed, then recalculate the table."
  (interactive)
  (save-excursion
    (end-of-line)
    (if (= (org-table-current-column) 5)
        ;; Rebind `org-table-fix-formulas-confirm' for the duration of
        ;; the column insertion.
        (let ((org-table-fix-formulas-confirm
               (lambda (arg) nil)))
          (org-table-insert-column)))
    (org-table-recalculate 'iterate)))
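If you use the hack regularly, you can give it a key binding in Org buffers; the key below is an arbitrary example, pick any free one:
(with-eval-after-load 'org
  ;; "C-c t" is just an example binding for the hack above.
  (define-key org-mode-map (kbd "C-c t") #'maybe-add-column-and-update))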


org-mode review clocked time by multiple tags

I would like to review my clocked time by tags, to answer questions like: how much time did I spend this week on my health, on my work, on a client, or on social relationships?
I am using tags because the items whose clocked time I want to review can be spread over multiple files and hidden in different subtrees. Maybe this is the problem and I need to restructure? E.g. "Write an entry in your diary" should be stored under "notes" but summed into "health", and of course under "notes" there would also be other notes like "finance"...
Any other solution, e.g. using a custom agenda view or categories instead of tags, would also be very welcome.
So far I have tried to use an org-mode clocktable grouped by multiple tags. For testing the clocktables I used this data:
* Take out the trash :private:
:LOGBOOK:
CLOCK: [2021-03-12 Fri 11:24]--[2021-03-12 Fri 11:30] => 0:06
:END:
* Update document for client :client1:
:LOGBOOK:
CLOCK: [2021-03-12 Fri 12:45]--[2021-03-12 Fri 13:30] => 0:45
:END:
* Create my awesome note for work :work:
:LOGBOOK:
CLOCK: [2021-03-13 Sat 11:24]--[2021-03-13 Sat 12:53] => 1:29
:END:
* Fill in timesheet :work:
:LOGBOOK:
CLOCK: [2021-03-12 Fri 11:24]--[2021-03-12 Fri 11:40] => 0:16
:END:
I have found the following solutions, but neither seems to work with my system.
Here my problem is perfectly described. I've downloaded the code; it will create a table but won't show the sums. Unfortunately, that code snippet seems too old, and I am not able to fix it. I have found a fork of that snippet which gives me this result:
#+BEGIN: clocktable-by-tag :tags ("work" "client1")
| Tag | Headline | Time (h) |
| | | <r> |
|---------+------------+----------|
| work | *Tag time* | *0.00* |
|---------+------------+----------|
| client1 | *Tag time* | *0.00* |
#+END:
Here I found another solution. The author uses a function to format the times, which are then used by orgaggregate. Unfortunately, already the first step doesn't seem to work correctly:
#+BEGIN: clocktable :scope file :maxlevel 3 :tags t :match "work|client1" :header "#+TBLNAME: timetable\n"
#+TBLNAME: timetable
| Tags | Headline | Time | T |
|---------+---------------------------------+--------+--------|
| | *Total time* | *2:30* | #ERROR |
|---------+---------------------------------+--------+--------|
| client1 | Update document for client | 0:45 | #ERROR |
| work | Create my awesome note for work | 1:29 | #ERROR |
| work | Fill in timesheet | 0:16 | #ERROR |
#+TBLFM: $4='(convert-org-clocktable-time-to-hhmm $3)::#1$4='(format "%s" "T")
#+END:
What I would like to achieve really shouldn't be that hard. At the moment the best solution I have is to use multiple tables, one for each tag:
#+BEGIN: clocktable :scope file :maxlevel 3 :match "work"
#+CAPTION: Clock summary at [2022-01-03 Mon 16:55]
| Headline | Time |
|---------------------------------+--------|
| *Total time* | *1:45* |
|---------------------------------+--------|
| Create my awesome note for work | 1:29 |
| Fill in timesheet | 0:16 |
#+END:
#+BEGIN: clocktable :scope file :maxlevel 3 :match "client1"
#+CAPTION: Clock summary at [2022-01-03 Mon 16:55]
| Headline | Time |
|----------------------------+--------|
| *Total time* | *0:45* |
|----------------------------+--------|
| Update document for client | 0:45 |
#+END:
The first solution you found was almost there. It had two issues that gave you the wrong result:
First, it considered only agenda files, not the current file, as input for the table. That's why you were getting empty results.
Second, to convert minutes into a nicely formatted display, the function org-duration-from-minutes can be used.
With these updates, your test file gave me this result:
#+BEGIN: clocktable-by-tag :tags ("work" "client1")
| Tag | Headline | Time (h) | |
|---------+-----------------------------------+----------+------|
| work | *Tag time* | 1:29 | |
| | File *test.org* | 1:29 | |
| | . Create my awesome note for work | | 1:29 |
|---------+-----------------------------------+----------+------|
| client1 | *Tag time* | 0:45 | |
| | File *test.org* | 0:45 | |
| | . Update document for client | | 0:45 |
#+END:
You can get a better summary by using :summary t (I personally prefer this option):
#+BEGIN: clocktable-by-tag :tags ("work" "client1") :summary t
| Tag | Headline | Time (h) |
|---------+------------+----------|
| work | *Tag time* | 1:29 |
|---------+------------+----------|
| client1 | *Tag time* | 0:45 |
#+END:
The :scope parameter also works, except that I didn't implement scopes narrower than the current file (such as subtree).
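For example, to aggregate over all agenda files instead of just the current file, something like the block below should work (a sketch; note that the helper locates each file with find-buffer-visiting, so the files have to be visited in a buffer):
#+BEGIN: clocktable-by-tag :tags ("work" "client1") :summary t :scope agenda
#+END: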
You can find the code in a gist, or copy and paste from below:
(require 'org-clock)

(defun clocktable-by-tag/shift-cell (n)
  "Return a string of N empty table cells."
  (let ((str ""))
    (dotimes (i n)
      (setq str (concat str "| ")))
    str))

(defun clocktable-by-tag/insert-tag (files params)
  "Insert the rows for the tag given in PARAMS, collected over FILES."
  (let ((tag (plist-get params :tags))
        (summary-only (plist-get params :summary))
        (total 0))
    (insert "|--\n")
    (insert (format "| %s | *Tag time* |\n" tag))
    (mapcar
     (lambda (file)
       (let ((clock-data (with-current-buffer (find-buffer-visiting file)
                           (org-clock-get-table-data (buffer-name) params))))
         (when (> (nth 1 clock-data) 0)
           (setq total (+ total (nth 1 clock-data)))
           (if (not summary-only)
               (progn
                 (insert (format "| | File *%s* | %s |\n"
                                 (file-name-nondirectory file)
                                 (org-duration-from-minutes (nth 1 clock-data))))
                 (dolist (entry (nth 2 clock-data))
                   (insert (format "| | . %s%s | %s %s |\n"
                                   (org-clocktable-indent-string (nth 0 entry))
                                   (nth 1 entry)
                                   (clocktable-by-tag/shift-cell (nth 0 entry))
                                   (org-duration-from-minutes (nth 4 entry))))))))))
     files)
    ;; Fill in the tag's total time next to the *Tag time* label.
    (save-excursion
      (re-search-backward "*Tag time*")
      (org-table-next-field)
      (org-table-blank-field)
      (insert (org-duration-from-minutes total))))
  (org-table-align))
(defun org-dblock-write:clocktable-by-tag (params)
  "Write a clock table that groups clocked time by the tags in PARAMS."
  (insert "| Tag | Headline | Time (h) |\n")
  (let* ((params (org-combine-plists org-clocktable-defaults params))
         (base-buffer (org-base-buffer (current-buffer)))
         (files (pcase (plist-get params :scope)
                  (`agenda
                   (org-agenda-files t))
                  (`agenda-with-archives
                   (org-add-archive-files (org-agenda-files t)))
                  (`file-with-archives
                   (let ((base-file (buffer-file-name base-buffer)))
                     (and base-file
                          (org-add-archive-files (list base-file)))))
                  ((or `nil `file)
                   (list (buffer-file-name)))
                  (_ (user-error "Unknown scope: %S" (plist-get params :scope)))))
         (tags (plist-get params :tags)))
    (mapcar (lambda (tag)
              (clocktable-by-tag/insert-tag
               files (org-combine-plists params `(:match ,tag :tags ,tag))))
            tags)))

How to format table fields as currency in org-mode

I'd like to format fields in an org-mode table as currency, meaning with a currency symbol ($) and commas as thousands separators. I've been using $%.2f to get e.g. $1000.00, but how do I get the comma separators, e.g. $1,000.00? I've RTFM but perhaps I am too dense to get it. Either a calc or elisp formula is fine. See sample table below:
| Item | Quantity | Price | Ext |
|----------+----------+--------+----------|
| Widget 1 | 10 | 100.00 | 1000.00 |
| Widget 2 | 5 | 50.00 | 250.00 |
| Widget 3 | 1 | 5.00 | 5.00 |
|----------+----------+--------+----------|
| | | Total | $1255.00 |
#+TBLFM: $4=($2*$3);%.2f::#5$4=vsum(#2..#4);$%.2f
I found no way of doing it consistently, such that you get numbers with thousands separators and these numbers are still correctly interpreted for further calculations. So this is not a full answer, just a record of my research so far.
The following example steals code to format numbers with thousands separators. Press C-c C-c on the source block to define the function, or add it to your init file.
Then, the grand total is calculated using elisp, and transformed with the new formatting function.
#+begin_src elisp :results none
(defun my-thousands-separate (num)
  "Formats the (possibly floating point) number with a thousands
separator."
  (let* ((nstr (number-to-string num))
         (dot-ind (string-match "\\." nstr))
         (nstr-no-decimal (if dot-ind
                              (substring nstr 0 dot-ind)
                            nstr))
         (nrest (if dot-ind
                    (substring nstr dot-ind)
                  nil))
         (pretty nil)
         (cnt 0))
    ;; Walk the integer part from the right, inserting a comma every
    ;; three digits, then re-attach the decimal part.
    (dolist (c (reverse (append nstr-no-decimal nil)))
      (if (and (zerop (% cnt 3)) (> cnt 0))
          (setq pretty (cons ?, pretty)))
      (setq pretty (cons c pretty))
      (setq cnt (1+ cnt)))
    (concat pretty nrest)))
#+end_src
| Item | Quantity | Price | Ext |
|----------+----------+------------+--------------|
| Widget 1 | 10 | 1001001.00 | 10010010.00 |
| Widget 2 | 5 | 501001.00 | 2505005.00 |
| Widget 3 | 1 | 51001.00 | 51001.00 |
|----------+----------+------------+--------------|
| | | Total | 12,566,016.0 |
#+TBLFM: $4=($2*$3);%.2f::#5$4='(my-thousands-separate (apply '+ '(#2..#4)));N
Note that if you do the same for the row totals, then the comma-separated numbers will not be interpreted correctly for the grand total.
The correct way would probably be to set the numeric locale and let printf do the trick, but I don't know how to set this up for Emacs.
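A partial workaround for that last point, if you do want comma-grouped intermediate values: strip the separators again before handing the fields to a further elisp formula. This is only a sketch, and the helper name is my own:
(defun my-parse-grouped-number (s)
  "Remove thousands separators from the string S and return it as a number."
  (string-to-number (replace-regexp-in-string "," "" s)))
You would then call this on the field strings inside an elisp formula instead of relying on org's numeric interpretation of the cells.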

Convert between an org-mode table and a table.el table without user interaction

I want to convert an org-mode table to a table.el table. For that I select the table:
| Option | Type | Value | Descr |
| -[no]h | bool | yes | Print |
| -[no]versio | bool | no | Print |
| -nice | int | 0 | Set t |
| -[no]v | bool | no | Be lo |
| -time | real | -1 | Take |
| -[no]rmvsbd | bool | yes | Removvirtual |
| sites | | | |
| -maxwarn | int | 0 | Numbe |
| procenerate | | | |
| unsta | | | |
| -[no]zero | bool | no | Set pthout |
| defau error | | | |
| -[no]renum | bool | yes | Renum |
| atomty | | | |
and press C-c ~. org-mode then asks me
Convert table to table.el table? (y or n)
How do I answer y programmatically? I read the docs of that defun -- there's no way to do it with a prefix arg.
Similar functionality in bash:
echo y | script-which-asks-y-or-n
C-c ~ calls the command org-table-create-with-table.el, which provides a bunch of wrappers around calling org-table-convert. If you want to use this function when you know you are already in an org-mode table, you don't need the wrappers; you just need at most two commands: org-table-align and org-table-convert.
So if you're doing this interactively, you can just call M-x org-table-convert and you're done. This assumes the table is already aligned. You can do this by hand by tabbing from one cell to the next, which triggers table alignment. Or you can do it with a small function:
(defun my-convert-tables ()
  "No questions asked, just convert the table."
  (interactive)
  (org-table-align)
  (org-table-convert))
You can do this programmatically as follows; just replace the function name test1 with org-table-create-with-table.el in a defadvice that is otherwise the same as the one below.
Using defadvice to run some code before and after the function, we save the function bound to the symbol y-or-n-p in a global variable, rebind it to a function that simply returns t, and restore the original binding after the advised function finishes.
(setq save-y-or-n-p nil)

(defadvice test1 (around always-yes)
  ;; Save the real `y-or-n-p', make it always answer yes, run the advised
  ;; function, then restore the original definition.
  (fset 'save-y-or-n-p (symbol-function 'y-or-n-p))
  (fset 'y-or-n-p (lambda (s) t))
  ad-do-it
  (fset 'y-or-n-p (symbol-function 'save-y-or-n-p)))
;; Activate the advice with (ad-activate 'test1) once it is defined.

(defun test1 ()
  (interactive)
  (if (y-or-n-p "Happy? ")
      (insert "Happy day")
    (insert "Unhappy day")))

Org-mode tables: Exclude columns from export

I have a set of tables in org-mode that I am exporting, but I'd like certain columns used for calculations and consumption by code blocks to be excluded from LaTeX export.
I'm sure I saw a way to do this by specifying a range of columns to export below the table, but I can't find reference to it anywhere on the web so there's a good chance I dreamt it.
Another way to achieve this is by defining a "hidden" column type H in the LaTeX header options of the org file and then using #+ATTR_LATEX: :align llH to indicate that the third column is to be hidden on export (source):
#+LATEX_HEADER: \usepackage{array}
#+LATEX_HEADER: \newcolumntype{H}{>{\setbox0=\hbox\bgroup}c<{\egroup}@{}}
#+ATTR_LATEX: :align llH
|-----+-------+------|
| 2 | 1/2 | junk |
| 4 | 1/4 | junk |
| 8 | 1/2 | junk |
If you are using "Radio Tables" you can do something like
#+ORGTBL: SEND some-name orgtbl-to-latex :skipcols (3)
|-----+-------+------|
| 2 | 1/2 | junk |
| 4 | 1/4 | junk |
| 8 | 1/2 | junk |
See http://www.gnu.org/software/emacs/manual/html_mono/org.html#Radio-tables for all the details.
I believe it may not be possible directly with export via C-c C-e, since the same answer is offered at http://comments.gmane.org/gmane.emacs.orgmode/33946 from November 2010.
I use Michael Brand's solution proposed here and cataloged by Derek Feichtinger here (make sure to view the file in raw mode, otherwise the source is hidden by GitHub).
For convenience, I reproduce the code below:
* Exporting tables with some columns hidden
It is desirable to be able to hide columns in exported output. This is often the
case in tables where a lot of computations are done, and where intermediate
results end up in columns that one does not want in the exported document.
This functionality is currently not available in standard org, but since this is Emacs, a simple function
implementing it was published by [[https://github.com/brandm][Michael Brand]] within this [[http://lists.gnu.org/archive/html/emacs-orgmode/2016-05/msg00027.html][emacs-orgmode thread]].
#+BEGIN_SRC emacs-lisp :results silent :exports source
(defun dfeich/org-export-delete-commented-cols (back-end)
  "Delete columns $2 to $> marked as `<#>' on a row with `/' in $1.
If you want a non-empty column $1 to be deleted make it $2 by
inserting an empty column before and adding `/' in $1."
  (while (re-search-forward "^[ \t]*| +/ +|\\(.*|\\)? +\\(<#>\\) *|" nil t)
    (goto-char (match-beginning 2))
    (org-table-delete-column)
    (beginning-of-line)))
(add-hook 'org-export-before-processing-hook #'dfeich/org-export-delete-commented-cols)
;; (remove-hook 'org-export-before-processing-hook #'dfeich/org-export-delete-commented-cols)
#+END_SRC
The exported table will have col2 removed.
| | col1 | col2 | col3 |
| / | <r> | <#> | |
| | a1 | a2 | a3 |
| | b1 | b2 | b3 |
http://www.gnu.org/software/emacs/manual/html_mono/org.html#The-spreadsheet
3.5.6 Editing and debugging formulas
Use ‘/’ for: Do not export this line. Useful for lines that contain the narrowing ‘<N>’ markers or column group markers.
Note that to do this you must use the first column of the table for this extra info.
I've found that the most convenient way is to:
make the table itself non-exportable with a :noexport: tag,
select the columns I want exported with a source block, using elisp, and make the result exportable.
Here is an example with elisp.
* Hidden :noexport:
#+NAME: google
| file | total | other | p |
|-----------+-------+-------+----|
| de-01.pdf | 312 | 76 | 76 |
| de-02.pdf | 428 | 101 | 77 |
| de-03.pdf | 1069 | 217 | 80 |
* Exported
Here it comes.
#+begin_src elisp :var data=google :colnames yes
;; select 0th and 3rd column from a table accessible with 'google' name
;; and do some math on it
(mapcar (lambda (e) (list (nth 0 e) (nth 3 e))) data)
#+end_src
#+RESULTS:
| file | p |
|-----------+----|
| de-01.pdf | 76 |
| de-02.pdf | 77 |
| de-03.pdf | 80 |
You can also use any other language that is supported by the source blocks to loop over the results and produce desired output.

Comint Mode Inserts Line Break Every 4096 Characters

Using Emacs 23.2.1 on Ubuntu Lucid, any mode based on Comint inserts occasional line breaks for larger outputs (see example Shell and SQL mode output, below). I've tried this in both SQL Mode and Shell Mode, with the same result in either case. Running similar commands in a plain terminal emulator does not cause these problems (for both shell mode and mysql mode commands).
Things I have tried:
Using MySQL in SQL Mode, adding the following flags: -A, -C, -t, -f, -n, and setting max_allowed_packet to 16MB.
Setting comint-buffer-maximum-size to 10240.
None of these have any effect on this behavior.
If I scroll up to the lines in question and delete the line breaks, the output then appears correctly, so a possible solution to this problem could involve a hook that deletes every 4096th character, if such a thing is possible.
Note: In the terminal examples, the output appears to be cut off at points other than every 4096 characters. In SQL-mode, it is exactly every 4096 (a suspicious number indeed).
Here is some sample output:
brent#battlecruiser:/$ for i in {1..4096}; do echo -n 0; done; echo;
0000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
In this case, it should print out a single line of 0s, but in fact a newline character has been inserted after 904 characters.
Also an Example in SQL Mode using MySQL:
mysql> show variables like '%n%';
+-----------------------------------------+----------------------------------+
| Variable_name | Value |
+-----------------------------------------+----------------------------------+
| auto_increment_increment | 1 |
| auto_increment_offset | 1 |
| binlog_cache_size | 32768 |
| binlog_format | STATEMENT |
| bulk_insert_buffer_size | 8388608 |
| character_set_client | utf8 |
| character_set_connection | utf8 |
| collation_connection | utf8_general_ci |
| collation_database | latin1_swedish_ci |
| collation_server | latin1_swedish_ci |
| completion_type | 0 |
| concurrent_insert | 1 |
| connect_timeout | 10 |
| delayed_insert_limit | 100 |
| delayed_insert_timeout | 300 |
| div_precision_increment | 4 |
| engine_condition_pushdown | ON |
| error_count | 0 |
| event_scheduler | OFF |
| foreign_key_checks | ON |
| ft_boolean_syntax | + -><()~*:""&| |
| ft_max_word_len | 84 |
| ft_min_word_len | 4 |
| ft_query_expansion_limit | 20 |
| general_log | OFF |
| general_log_file | /var/lib/mysql/battlecruiser.log |
| group_concat_max_len | 1024 |
| have_community_features | YES |
| have_dynamic_loading | YES |
| have_innodb | YES |
| have_ndbcluster | NO |
| have_openssl | DISABLED |
| have_partitioning | YES |
| have_symlink | YES |
| hostname | battlecruiser |
| identity | 0 |
| ignore_builtin_innodb | OFF |
| init_connect | |
| init_file | |
| init_slave | |
| innodb_adaptive_hash_index | ON |
| innodb_additional_mem_pool_size | 1048576 |
| innodb_autoextend_increment | 8 |
| innodb_autoinc_lock_mode | 1 |
| innodb_buffer_pool_size | 8388608 |
| innodb_checksums | ON |
| innodb_commit_concurrency | 0 |
| innodb_concurrency_tickets | 500 |
| innodb_data_file_path | ibdata1:10M:autoextend
|
| innodb_data_home_dir | |
| innodb_doublewrite | ON |
| innodb_fast_shutdown | 1 |
| innodb_file_io_threads | 4 |
| innodb_file_per_table | OFF |
| innodb_flush_log_at_trx_commit | 1 |
| innodb_flush_method | |
| innodb_force_recovery | 0 |
| innodb_lock_wait_timeout | 50 |
| innodb_locks_unsafe_for_binlog | OFF |
| innodb_log_buffer_size | 1048576 |
| innodb_log_file_size | 5242880 |
| innodb_log_files_in_group | 2 |
| innodb_log_group_home_dir | ./ |
| innodb_max_dirty_pages_pct | 90 |
| innodb_max_purge_lag | 0 |
| innodb_mirrored_log_groups | 1 |
| innodb_open_files | 300 |
| innodb_rollback_on_timeout | OFF |
| innodb_stats_on_metadata | ON |
| innodb_support_xa | ON |
| innodb_sync_spin_loops | 20 |
| innodb_table_locks | ON |
| innodb_thread_concurrency | 8 |
| innodb_thread_sleep_delay | 10000 |
| innodb_use_legacy_cardinality_algorithm | ON |
| insert_id | 0 |
| interactive_timeout | 28800 |
| join_buffer_size | 131072 |
| keep_files_on_create | OFF |
| key_cache_division_limit | 100 |
| language | /usr/share/mysql/english/ |
| last_insert_id | 0 |
| lc_time_names | en_US |
| license | GPL |
| local_infile | ON |
| locked_in_memory | OFF |
| log_bin | OFF |
| log_bin_trust_function_creators | OFF |
| log_bin_trust_routine_creators | OFF |
| log_queries_not_using_indexes | OFF |
| log_warnings | 1 |
| long_query_time | 10.000000 |
| lower_case_table_names | 0 |
| max_binlog_cache_size | 4294963200 |
| max_binlog_size | 104857600 |
| max_connect_errors | 10 |
| max_connections | 151 |
| max_error_count | 64 |
| max_insert_delayed_threads | 20 |
| max_join_size | 18446744073709551615 |
| max_length_for_sort_data | 1024
|
| max_prepared_stmt_count | 16382 |
| max_sort_length | 1024 |
| max_sp_recursion_depth | 0 |
| max_user_connections | 0 |
| max_write_lock_count | 4294967295 |
| min_examined_row_limit | 0 |
| multi_range_count | 256 |
| myisam_data_pointer_size | 6 |
| myisam_recover_options | BACKUP |
| net_buffer_length | 16384 |
| net_read_timeout | 30 |
| net_retry_count | 10 |
| net_write_timeout | 60 |
| new | OFF |
| open_files_limit | 1024 |
| optimizer_prune_level | 1 |
| plugin_dir | /usr/lib/mysql/plugin |
| profiling | OFF |
| profiling_history_size | 15 |
| protocol_version | 10 |
| query_cache_min_res_unit | 4096 |
| query_cache_wlock_invalidate | OFF |
| rand_seed1 | |
| rand_seed2 | |
| range_alloc_block_size | 4096 |
| read_only | OFF |
| read_rnd_buffer_size | 262144 |
| relay_log_index | |
| relay_log_info_file | relay-log.info |
| rpl_recovery_rank | 0 |
| skip_external_locking | ON |
| skip_networking | OFF |
| slave_net_timeout | 3600 |
| slave_transaction_retries | 10 |
| slow_launch_time | 2 |
| sql_auto_is_null | ON |
| sql_log_bin | ON |
| sql_max_join_size | 18446744073709551615 |
| sql_notes | ON |
| sql_slave_skip_counter | |
| sql_warnings | OFF |
| storage_engine | MyISAM |
| sync_binlog | 0 |
| sync_frm | ON |
| system_time_zone | EDT |
| table_definition_cache | 256 |
| table_open_cache | 64 |
| thread_handling | one-thread-per-connection |
| time_zone | SYSTEM |
| transaction_alloc_block_size | 8192 |
| transaction_prealloc_size | 4096 |
| tx_isolation
| REPEATABLE-READ |
| unique_checks | ON |
| version | 5.1.41-3ubuntu12.10 |
| version_comment | (Ubuntu) |
| version_compile_machine | i486 |
| version_compile_os | debian-linux-gnu |
| warning_count | 0 |
+-----------------------------------------+----------------------------------+
159 rows in set (0.00 sec)
Here the output is always interrupted by a newline at exact intervals of 4096 characters.
In addition to possible solutions, any new ways to find more information about what is happening would be appreciated.
I had similar problems, though my breaks seemed to be at 1024 characters (ah-ha! in version 21.1 this was the case). This wasn't that big a deal for me, but I did write something that properly concatenated the results so I could post-process them. That didn't affect the output though, so it won't be much help.
The root of your problem lies in read_process_output in process.c, which hard codes the 4096:
/* Read pending output from the process channel,
   starting with our buffered-ahead character if we have one.
   Yield number of decoded characters read.
   This function reads at most 4096 characters.
   If you want to read all available subprocess output,
   you must call it repeatedly until it returns zero.
   The characters read are decoded according to PROC's coding-system
   for decoding. */
static int
read_process_output (proc, channel)
     Lisp_Object proc;
     register int channel;
{
  /* ... snip ... */
  int readmax = 4096;
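As an aside: in current Emacs (27 and later) this limit is no longer hard coded; it is controlled by the variable read-process-output-max, which still defaults to 4096 bytes. Raising it changes the chunk size handed to the output filters, but it does not by itself remove newlines that a filter function has already inserted:
;; Emacs 27 and later only: read larger chunks from subprocesses.
(setq read-process-output-max (* 64 1024))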
Like you mentioned in your question, a possible solution would be to write a function (call it clean-up-comint-output-at-4096-chars) and add it to comint-output-filter-functions. Something like this (note: untested code):
(add-hook 'comint-output-filter-functions 'clean-up-comint-output-at-4096-chars)

(defun clean-up-comint-output-at-4096-chars (&optional str)
  "Look for a string of 4096 length and remove the newline in the buffer."
  (let ((magic-block-size 4096))
    (save-match-data
      (when (= magic-block-size (length str))
        ;; at the magic block size, look for a newline
        (goto-char (point-max))
        (when (and (search-backward str nil t)
                   (progn
                     (forward-char magic-block-size)
                     (looking-at "\n")))
          (delete-char 1))))))
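Before wiring up a fix like that, it can help to confirm where the chunk boundaries actually fall. A minimal diagnostic sketch (the function name is mine, not from the original code):
(defun my/log-comint-chunk-size (output)
  "Message the length of each OUTPUT chunk and return it unchanged."
  (message "comint chunk: %d chars" (length output))
  output)
(add-hook 'comint-preoutput-filter-functions #'my/log-comint-chunk-size)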
I have found the solution to this problem. I had put in my configuration file the following code sourced from http://www.emacswiki.org/emacs/SqlMode
(defun sql-add-newline-first (output)
  "Add newline to beginning of OUTPUT for `comint-preoutput-filter-functions'."
  (concat "\n" output))

(defun sqli-add-hooks ()
  "Add hooks to `sql-interactive-mode-hook'."
  (add-hook 'comint-preoutput-filter-functions
            'sql-add-newline-first))
(add-hook 'sql-interactive-mode-hook 'sqli-add-hooks)
After removing the code (which, because it sets comint-preoutput-filter-functions globally, affects shell-mode as well), I no longer experience these issues.
My proposed replacement for this code to get the behavior I want (works for me so far):
(defun sql-add-newline-first (output)
  "Add newline to beginning of OUTPUT for `comint-preoutput-filter-functions'."
  ;; Remove the hook again so it fires only once per sent region.
  (remove-hook 'comint-preoutput-filter-functions
               'sql-add-newline-first)
  (concat "\n" output))

(defun sql-send-region-better (start end)
  "Send a region to the SQL process."
  (interactive "r")
  (if (buffer-live-p sql-buffer)
      (save-excursion
        (add-hook 'comint-preoutput-filter-functions
                  'sql-add-newline-first)
        (comint-send-region sql-buffer start end)
        (if (string-match "\n$" (buffer-substring start end))
            ()
          (comint-send-string sql-buffer "\n"))
        (message "Sent string to buffer %s." (buffer-name sql-buffer))
        (if sql-pop-to-buffer-after-send-region
            (pop-to-buffer sql-buffer)
          (display-buffer sql-buffer)))
    (message "No SQL process started.")))

(defvar sql-mode-map
  (let ((map (make-sparse-keymap)))
    (define-key map (kbd "C-c C-c") 'sql-send-paragraph)
    (define-key map (kbd "C-c C-r") 'sql-send-region-better)
    (define-key map (kbd "C-c C-s") 'sql-send-string)
    (define-key map (kbd "C-c C-b") 'sql-send-buffer)
    map)
  "Mode map used for `sql-mode'.")
Essentially, I am adding the hook right before my sql-send-region-better code starts sending output, then inside the hook I am removing the hook again, guaranteeing that it only inserts the one new line that I want.
Here is my implementation of only prepending "\n" once per input:
(defvar sql-last-prompt-pos 1
  "Position of last prompt when added recording started.")
(make-variable-buffer-local 'sql-last-prompt-pos)
(put 'sql-last-prompt-pos 'permanent-local t)

(defun sql-add-newline-first (output)
  "Add newline to beginning of OUTPUT for `comint-preoutput-filter-functions'.
This fixes up the display of queries sent to the inferior buffer
programmatically, but also adds an extra newline for interactive
commands."
  ;; Note: `comint-last-prompt-overlay' no longer exists in newer Emacs
  ;; versions (it was replaced by `comint-last-prompt'), so this snippet
  ;; may need adapting there.
  (let ((begin-of-prompt
         (or (and comint-last-prompt-overlay
                  ;; sometimes this overlay is not on prompt
                  (save-excursion
                    (goto-char (overlay-start comint-last-prompt-overlay))
                    (looking-at-p comint-prompt-regexp)
                    (point)))
             1)))
    (if (> begin-of-prompt sql-last-prompt-pos)
        (progn
          (setq sql-last-prompt-pos begin-of-prompt)
          (concat "\n" output))
      output)))

(defun le-sqli-setup ()
  "Add hooks to `sql-interactive-mode-hook'."
  (add-hook 'comint-preoutput-filter-functions
            'sql-add-newline-first t t))

(add-hook 'sql-interactive-mode-hook 'le-sqli-setup)
My solution -- add a newline (but then remove the hook to prevent multiple ones breaking up the text). And then re-add the hook at every input prompt.
(defun sql-add-newline-first (output)
  "Add newline to beginning of sql OUTPUT, but remove the hook so
that it doesn't output a newline every time the output cache is
filled."
  (remove-hook 'comint-preoutput-filter-functions 'sql-add-newline-first)
  (concat "\n" output))

(defun sql-readd-newline-first (ignore)
  "Re-add the newline-prepending hook."
  (add-hook 'comint-preoutput-filter-functions 'sql-add-newline-first))

(defun sqli-add-hooks ()
  "Add the 'suicidal' newline printing hook, and another hook to
respawn it at every input prompt."
  (add-hook 'comint-preoutput-filter-functions 'sql-add-newline-first)
  (add-hook 'comint-input-filter-functions 'sql-readd-newline-first))

(add-hook 'sql-interactive-mode-hook 'sqli-add-hooks)
Also, in my case I was using PostgreSQL, which has the nasty habit of putting extra prompts after a multiline query (like database-# database-# database-# | col | col |), which pushes the column names away. To solve this, I eventually did the following:
(defun sql-remove-continuing-prompts (output)
  "Prepend a newline and strip psql continuation prompts from OUTPUT."
  (concat "\n" (replace-regexp-in-string "warren_hero[^=()]# " "" output)))

(defun sqli-add-hooks ()
  (add-hook 'comint-preoutput-filter-functions 'sql-remove-continuing-prompts))

(add-hook 'sql-interactive-mode-hook 'sqli-add-hooks)