Symptoms

The exact error when executing the cron job manually (/usr/bin/php -f /var/www/vhosts/example.com/httpdocs/cron.php) was:

PHP Fatal error:  Allowed memory size of 536870912 bytes exhausted (tried to allocate 7 bytes) in /var/www/vhosts/onlinediscountmarkt.de/httpdocs/lib/Zend/Db/Statement/Pdo.php on line 290

Solution

This error can happen if Magento's cron_schedule table contains too many records, accumulated from the log entries of past cron runs. To fix it, delete all entries in that table, either directly in the database with TRUNCATE TABLE cron_schedule; or via phpMyAdmin or a similar tool.

Now, when calling the cron job again from the command line, it finishes normally, without a crash.
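For reference, a minimal way to do this from the command line. The user and database names below are placeholders for your installation; truncating is safe here because Magento repopulates cron_schedule on the next cron run:

```shell
# Clear the bloated cron log table directly in MySQL.
# "magento_user" and "magento_db" are placeholders; adjust to your setup.
mysql -u magento_user -p magento_db -e "TRUNCATE TABLE cron_schedule;"

# Check that the table is empty afterwards:
mysql -u magento_user -p magento_db -e "SELECT COUNT(*) FROM cron_schedule;"
```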

Prevention

It seems the table could grow that large because, in "System -> Configuration -> System -> Cron", "History cleanup every" had been set to "1440" under the assumption that the value is in minutes. Instead, it seems the value is interpreted as days. The same applies to "Success history lifetime" and "Failure history lifetime" there. So better set all three to some meaningful value like "30".

Discussion

This "Allowed memory size exhausted" error also persisted after installing the AOE Scheduler module, removing all scheduled cron tasks, and then calling the cron job manually. This hinted that the error is unrelated to any single one of Magento's various cron tasks (and thus also independent of installed contrib modules), and instead happens in the Magento core. (This is Magento 1.5.0.1, by the way.)

Also, from searching for the error message on the Internet, it appears that the "Allowed memory size exhausted" problem at this specific code location is a rather generic error that also occurs, for example, when processing too many product records at once (see for example here or here).

This led me to look at Magento's cron_schedule table; in this case it had 673,498 entries according to AOE Scheduler (in the database, even 830,881 rows). These were too many records for Pdo.php to process within the memory limit: in line with other reports of the "Allowed memory size exhausted" error in Pdo.php line 290, the error here was caused by too many records in this cron_schedule table.

This applies, for example, when you want to add a photovoltaics installation to your campervan, expedition vehicle, garden hut, off-grid house or similar. To have enough electricity year-round from photovoltaics alone, battery capacity and module size have to be dimensioned properly.

The best tool I found for this is the European Commission JRC's PV potential estimation utility. There, use the last tab "Stand-alone PV".

Note that the tilt angle of the solar panels is important in winter. Deviations of up to ca. 30° from the optimum have little effect, but beyond that they become quite important. So having an angle of 0° (flat panels) when you should have an angle of 74° (Germany in winter, for example) means you get only about 25% of the power you would get at a 74° angle. You can calculate the exact numbers for this with the SunAngle calculator.
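As a plausibility check, a simple cosine-of-incidence model (an assumption here; real irradiance calculations as done by SunAngle are more involved) lands in the same ballpark: a panel misaligned by 74° from the optimum captures about cos(74°) of the direct irradiance, i.e. roughly 28%, close to the ~25% figure above.

```shell
# Direct-irradiance fraction captured by a panel that is off by 74°
# from the optimum tilt, using a plain cosine-of-incidence model:
awk 'BEGIN { pi = atan2(0, -1); printf "%.0f%%\n", 100 * cos(74 * pi / 180) }'
```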

Instructions

These are instructions for importing messages held in KMail 2 into Thunderbird. The versions used were KMail 4.10.5 and Thunderbird 17.0.8, but the exact versions should not matter for this to work.

  1. Install ImportExportTools in Thunderbird.
  2. In Thunderbird, create a folder into which you want your e-mails imported. This can be in "Local Folders", which is recommended, but also in an IMAP account. In the latter case, importing and uploading happen at the same time, so there are two error sources, and the upload step cannot easily be repeated on failure without repeating the import step as well.
  3. Right-click on the new folder and select "ImportExportTools -> Import Messages".
  4. In the file selection dialog, go to ~/.kde/share/apps/kmail/mail/<your folder to import>/cur/, select "All Files" in the file filter at the bottom, and select all these messages (simplest by pressing Ctrl+A).
  5. After the import, all messages are marked as unread. Right-click the folder again and select "Mark folder as read" to fix this.

This should import all messages nicely and without errors. You may compare message counts to those in KMail.
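For the message count comparison, you can count the files in the folder's maildir, since in the maildir layout each file in the cur/ directory is one message. The folder name is a placeholder, as in step 4 above:

```shell
# Count KMail's messages for a folder to compare with Thunderbird's count.
# FOLDER is a placeholder; insert your folder's name from step 4.
FOLDER="<your folder to import>"
find ~/.kde/share/apps/kmail/mail/"$FOLDER"/cur/ -type f | wc -l
```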

Discussion

The above solution seems quite straightforward, but it really was the only working solution I found. Various other solutions are supposed to work, but did not:

  • Uploading local e-mails to an IMAP folder in KMail, then downloading them again from there in Thunderbird. This usually works, but in some cases it triggers errors in KMail. For example, creating a new folder is not possible in KMail if the e-mail server is based on Courier.
  • Selecting a bunch of e-mails in KMail, right-clicking them and then selecting "Save as …" gives the option to save them all into one .mbox file, which can then be imported with "ImportExportTools -> Import Mbox files …" in Thunderbird. However, that import often fails, so that no e-mails are imported at all.

It has been nearly a year since the last release. But then again, this is just a for-fun project for me at the moment, into which I put the occasional spare hour to relax by indulging in creative technological thinking and some systems engineering. Anyway, here we go:

The new version 0.13 of the EarthOS document has been published. Access it via that link, or via “Downloads -> Main” in the site menu.

What's it all about? A constantly evolving open content project in which I collect and orchestrate existing open hardware and open source projects, plus my own ideas, into a system with which you could cover all material and technological requirements of living, from food to water supply, mobility to clothing, health care to Internet connection.

What is it good for? Personally, I use it as a framework of thought: I do not implement all of it, since that would be too much effort for one person alone, but where applicable I align the designs of things I build with the EarthOS proposals. For example, things I integrate into my mobile home. Beyond that, I think it would be great as a combined roadmap that could help all the open projects coordinate their efforts towards arriving at a fully open world efficiently and fast. Preferably in our lifetime!

What's new? Too much to count. Really, it's 850 pages now and I do not remember everything I changed. The main aspect, however, is the energy supply design: levels L2 and L3 rely on biomass gasification now. This includes innovative ideas that, to my knowledge, have never been tried, like a bike with a wood-gas-powered assist engine, or a truck where 100% of the fuel energy is put to use, because the 60–70% of thermal energy normally lost via the exhaust and radiator is used for meaningful work like drying collected biomass and recycling water by multi-stage distillation.

Why are there no images? :S Umh, I admit the document is very much in a draft state, also still containing untranslated German sections etc. It's like a brain dump … and probably shows the nature of my thoughts: many details, but not aesthetically pleasing 😀 I am wide open to your proposals for how we could make this a great, usable piece of content. To make faster progress, we'd either need more people, or some compensation so I can put in more hours. Any ideas? Any feedback on how suitable this would be for a small crowdfunding campaign?

Recording a video from Linux desktop content, like for creating a screencast presentation, is quite simple with recordmydesktop or its GUI version gtk-recordmydesktop.

However, if you want to capture the system sound as well (as opposed to what you might speak live into a microphone), it gets a bit more difficult. Here is one possible solution using the PulseAudio sound server (available by default in Ubuntu Linux):

  1. Install pavucontrol by executing sudo apt-get install pavucontrol.
  2. In gtk-recordmydesktop, go to "Advanced -> Sound -> Device" and change the value from "DEFAULT" to "default".
  3. Start pavucontrol.
  4. Do a test recording with gtk-recordmydesktop. While it's recording, go to the "Recording" tab in pavucontrol and, in the entry "ALSA plug-in[recordmydesktop]: ALSA capture from", change the value from "Built-in Audio Analog Stereo" to "Monitor of Built-in Audio Analog Stereo".
  5. In tab "Input devices", use the dropdown at the bottom to also display monitor devices, look for the "Monitor of Built-in Audio Analog Stereo" device and make sure it's not muted and the volume gauge is at 100%. When something is playing through your speakers, a signal should show up there, regardless of whether you are recording at the moment.
  6. Record with gtk-recordmydesktop as usual.

This solution was taken from ubuntuforums.org thread 1509398.
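If you prefer the command line over pavucontrol, the rerouting in step 4 can presumably also be done with pactl (from the pulseaudio-utils package). The stream ID and monitor source name below are made-up examples, not fixed identifiers; look up the real ones on your machine with the list commands first:

```shell
# List capture sources; monitor sources have names ending in ".monitor":
pactl list sources short

# While recordmydesktop is recording, find the ID of its capture stream:
pactl list source-outputs short

# Move that stream to the monitor source (replace ID and source name
# with the values found above; both shown here are examples):
pactl move-source-output 42 alsa_output.pci-0000_00_1b.0.analog-stereo.monitor
```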

There is also another solution involving only ALSA and the snd-aloop kernel module, recording from a loopback sound card. However, I could not get it to work.

Acquia Drupal Commons 3 is a great Drupal 7 distribution to create a social network type site. Casetracker is a nice, extensible support ticket management system for Drupal 7 that can also be used as a task manager for distributed collaboration on tasks. Here's how to integrate both cleanly as I did for the Edgeryders site:

  1. Create a new Drupal Commons integrated content type as a module. This is really simple: take one of the existing, quite generic Drupal Commons content type modules (commons_post), copy it to another module (here, commons_tasks) and replace all occurrences of post / posts with task / tasks respectively. In my case, the result is the module commons_tasks. It is not yet readily packaged as a Drupal module, but you can already download the files from that link. That's better than creating it yourself, since a few other tweaks are included (like adapting the link for creating a new task to include a reference to the Casetracker project ID as a parameter, indicating which project to create the task for).
  2. Install the new module. Here: Save the directory commons_tasks under ./sites/all/modules and call: drush pm-enable commons_tasks.
  3. Adapt Casetracker settings. In the "Casetracker settings" screen (at /admin/config/casetracker/settings), set "Group" to be the only project node type and "Task" to be the only case node type.
  4. Add og_group_ref to the new content type. Go to "Administration -> Structure -> Content types -> Task -> Manage fields" (/admin/structure/types/manage/task/fields) and add the existing field og_group_ref.
  5. Configure field og_group_ref. That is, configure the field and field display settings to be the same as in the Drupal Commons content types, e.g. Post. Especially take care to enable the "field prepopulate" setting, to select the proper group membership as default when creating new content from within a group.
  6. Configure permissions for Casetracker. To be done at /admin/people/permissions. Note that you don't need to give any permissions in the "CT Basic" section because these only relate to the Casetracker's default project and case content types, which we don't use.
  7. Disable the casetracker_basic module. It's not needed because we use other content types for Casetracker projects and cases here. Execute: drush pm-disable casetracker_basic.
  8. Adapt comments settings. In the content type settings for "Task" (/admin/structure/types/manage/task), adapt the comments accordingly. You probably want neither a title nor threading for the comments.

As a result, all Tasks are now handled by the casetracker module and can also be managed in its task manager at /casetracker. But additionally, tasks are content of organic groups like posts, wikis, polls and questions in Drupal Commons, and are nicely integrated with the Drupal Commons group browsing widget, content creation widget and notifications system, including the "Follow" and e-mail notifications features.
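The copy-and-replace part of step 1 can be sketched in the shell. This is a hypothetical sketch: it uses a scratch directory with a one-line stand-in for the real module files, while in a real site you would run the copy inside sites/all/modules on the actual commons_post directory:

```shell
# Scratch stand-in for the real commons_post module directory:
mkdir -p scratch/commons_post
echo 'commons_post shows a post and lists posts' > scratch/commons_post/commons_post.module

# Copy the module directory under the new name ...
cp -r scratch/commons_post scratch/commons_tasks
cd scratch/commons_tasks

# ... rename the module files themselves ...
for f in commons_post.*; do mv "$f" "commons_tasks${f#commons_post}"; done

# ... and replace post/posts with task/tasks inside all files
# (module name first, then whole words only, plural before singular):
sed -i 's/commons_post/commons_tasks/g; s/\bposts\b/tasks/g; s/\bpost\b/task/g' ./*

cat commons_tasks.module
```

Remember that the downloadable commons_tasks linked in step 1 contains further tweaks beyond this plain renaming.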

For example, you might be able to read and understand a foreign-language website sufficiently well using machine translation (like Google Translate), but machine translation is mostly not sufficient for contributing your own content to it. So what now? The optimum would be a webservice that offers instant translation for small texts, or translation with a short turnaround time like one hour, after which you can insert your statement into the website or forum. Ideally, such a webservice would be P2P organized: you would earn credits by doing instant translations for others, and spend them to get your own texts translated.

Funnily enough, this exact thing does not exist. Here's what I found instead, ordered by adequacy, starting with the best solutions:

  • Fiverr. So far the best solution I could find for cases where you don't need super professional (just understandable) translation quality. You will easily find people offering 500 – 1000 words of translation for 5 USD. Turnaround time can be as fast as 24 hours. Or try to make a special deal with somebody offering translation, language lessons or similar, so that you agree on a time to chat where you can get the translation right away, or agree that you can send multiple e-mails with short texts which the translator will translate immediately or within a day, up to a total word amount.
  • SpeakLike Chat Translation. They indeed do realtime translation in chats, and by chatting with yourself on two accounts you can of course intercept the translation and put it into a webpage later on. The price is about 0.05 EUR/word [source].
  • SOS Translator Chat. They offer live translation by instant messaging with a translator. They charge a start rate plus a rate per minute, not per word [source]. So this is rather difficult to fit in for contributing content to a website.
  • VerbalizeIt Skype-embedded translation. It seems they offer text-to-text realtime translation for Skype chats. However it's a professional service, so it's not cheap, seemingly starting at 0.17 USD/word [source].
  • ackuna. Nice idea: a crowdsourcing site for translating. You translate something for others, and you can get something translated by others. However, there is no system where you have to earn points by translating before you can get something translated, so at first glance it seems you can't count on your text being translated in reasonable time. It can take months. Also, the site is mostly for short phrases as found in software applications – while it's possible to enter paragraphs of text, this will probably quickly overstretch the goodwill of the uncompensated volunteers. The site also accepts special file formats for software i18n, but is not limited to them: there's also a textbox to paste the text you want translated [source, at "How do I create a project?"]. I have to admit that I don't like Ackuna's translator interface too much: it's a good start, but a lot of tweaks would be needed for great usability. Like translating in lists (with tabbing), AJAX voting in lists (saving more page loads), a transactional one-point-per-word system where people have to earn points by translating before posting their own projects, showing only entries without any submission while translating, etc.
  • Transfix.it. A software where humans proofread and fix machine-translated texts.
  • WikiTranslation. A site for gratis, community-generated translations which can be voted.
  • OneHourTranslation. Fast turnaround time of at most one hour to start, plus one hour per 200 words. But mostly too expensive for private use in forums etc. (0.06 EUR/word).

Some other interesting finds about translation include:

  • Linguanaut Free Translation. Translation is done by volunteers, which means they are of course not compelled to do the job, there is no deadline, and one can't be sure when the job gets done. But for occasional, non-time-critical translations it seems great. It is not a good idea to exploit volunteers with high-volume work, of course.
  • Free human translation at Translatorsbase.com. Only for single words and short phrases, but for these really useful. The translators get higher ranking by providing such free translations.
  • Gengo. Offering people-powered translation services.
  • Babylon Human Translation. By well-known translation software provider "Babylon".
  • TYWI Mobile Interpreter. Offers simultaneous human translation of speech, via a professional translator, connected over the Internet.
  • Wikipedia on Telephone interpretation.