Here at Mendeley we have a hack day every month where our developers (and even the non-techy folk) try to come up with cool and/or useful projects. On one of those days, Carles Pina from our Desktop team thought it would be interesting to play around with the Google Maps API to visualize Mendeley activity around the globe. He took the Apache logs from Mendeley Desktop sync, mapped each IP address to a geolocation using python-geoip, and then fed this into the Google Maps API to generate each keyframe.
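For the curious, the log-parsing half of that pipeline might look roughly like this in Python. This is a hypothetical sketch, not the actual hack day code: the log format, sample line, and helper names are assumptions, and only the `geolite2.lookup` call comes from python-geoip's documented API.

```python
# Sketch: extract client IPs and timestamps from Apache access-log lines,
# then (optionally) map each IP to a lat/lon with python-geoip.
import re
from datetime import datetime

# Typical Apache "combined" log line prefix: IP, identity, user, [timestamp] ...
LOG_RE = re.compile(r'^(\d+\.\d+\.\d+\.\d+) \S+ \S+ \[([^\]]+)\]')

def parse_sync_event(line):
    """Return (ip, datetime) for one access-log line, or None if it doesn't match."""
    m = LOG_RE.match(line)
    if not m:
        return None
    ip = m.group(1)
    # Apache timestamps look like 10/Oct/2011:13:55:36 +0000
    when = datetime.strptime(m.group(2), '%d/%b/%Y:%H:%M:%S %z')
    return ip, when

def locate(ip):
    """Map an IP to a (lat, lon) tuple, or None if it can't be resolved."""
    # Requires the python-geoip and python-geoip-geolite2 packages.
    from geoip import geolite2
    match = geolite2.lookup(ip)
    return match.location if match else None

# Example with a made-up log line (203.0.113.0/24 is a documentation range):
line = '203.0.113.7 - - [10/Oct/2011:13:55:36 +0000] "POST /sync HTTP/1.1" 200 512'
event = parse_sync_event(line)
```

Each `(lat, lon, timestamp)` triple could then be bucketed by time window and handed to the Google Maps API to render one keyframe per window.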
As you play through the video – which covers about 2.5 days of Mendeley activity – each glowing dot represents a sync event, which then gently fades out over a few frames. The darker shade that moves across the screen shows the time between sunset and sunrise, and you might notice that the activity decreases at night, but that in places like the US East Coast – perhaps unsurprisingly – there are plenty of people burning the midnight oil. It seems that researchers in New York don’t really believe in sleep. Steve Dennis worked on making this beautiful video, with relaxing music provided by The Disconnect, and we wanted to share it with you. It is, after all, a video starring our users and showing how Mendeley is being used by researchers around the world, and around the clock.
We hope you like it, let us know what you think and if you have any suggestions!
Back in 2006, Moshe Pritsker had the idea of using video technology to capture and transmit the intricacies of life science research, facilitating both the understanding and reproduction of experiments and techniques. This idea of “letting scientists look over each other’s shoulders” led to the launch of JoVE, the Journal of Visualized Experiments, which is peer reviewed and PubMed-indexed. As a scientific journal, it has an editorial board and hierarchical structure, and ensures consistent quality of its video content by maintaining a network of professional videographers spread across major science centres. Scientists from leading institutions participate by submitting video articles that visualize their experiments.
As science advances, processes and tools also become more complex. Procedures and techniques such as growing stem cells are tremendously complicated and difficult to follow accurately with just a set of written instructions, and visiting labs in person can be a very expensive alternative beyond the resources of many researchers. This challenge of poor experiment reproducibility is what JoVE tries to address, claiming that traditional written and static picture-based print journals are no longer sufficient to accurately convey the intricacies of modern research. Translating findings from the bench to clinical therapies relies on the rapid transfer of knowledge within the research community.
This month’s issue features an article by Connors et al. of Massachusetts Eye & Ear and Harvard Medical School, who have developed an audio-based virtual environment simulator that uses audio cues and a video game context to build cognitive maps of three-dimensional spaces and help blind people improve their navigation skills. Other videos include a new non-invasive method being developed at the Massachusetts General Hospital and Harvard Medical School for measuring brain metabolism in newborn babies, and a demonstration of how a biopolymer gel derived from polysaccharides found in brown algae can help patients with heart failure.
There are also other companies operating in the scientific video space, but what they offer is a looser user-generated environment. One of the most successful of those is SciVee, which is backed by the Public Library of Science and features videos that sit alongside traditional journal papers.
So is this the new frontier? Are we actually looking at a situation where most researchers will feel comfortable communicating with their peers using video? Has the scientific community truly given its blessing to such new approaches to science communication? We’d love to hear your thoughts.
Search has become a fundamental part of our daily routine. Everyone uses search tools every day: Google, Spotlight, file search, and so on. There is simply too much information to properly organize, memorize, and store in a structured fashion. And that’s OK.
Mendeley Desktop provides you with a multitude of ways to organize, filter, and search your documents. Many of these tasks are context-based, meaning that if you search while looking at your library or a collection in your library, you only get results from the currently selected folder. If you happen to be reading a PDF in Mendeley Desktop, the search tool will show you results only within that paper.
Now, one thing you (and many other Mendeley Desktop users) probably don’t know is that you can constrain your search to specific fields such as Title, Authors, and even your own notes. Yes, you can search for text contained within your notes!
- Go to the search box in the top right-hand corner of Mendeley Desktop
- Click on the little arrow pointing downward and select “Notes”
- Type in your keyword of interest
- You should start seeing your results update in the middle pane in near real-time
Here’s a quick view of the search box in action in Mendeley Desktop (Mac):
How cool is that? We think it’s pretty cool (and useful!).
Here are the previous eight entries in our How-to series:
Seminar Streams is a new service that hosts open access lectures and academic talks. They’ve already got some great academic content, such as this talk about membrane protein assembly and this one about the inflammation-cancer link via NF-kappa B and HIF. We think open access academic video content is a great addition to what’s available online, joining such efforts as the Journal of Visualized Experiments.
One thing I’d like to see more of from Seminar Streams is a clear indication of which content is available for re-use. Flickr does a good job of this.