Pages

Saturday, November 22, 2014

SC14: Looking back, but forward thinking

As I reflect on SC14, I wanted to share some observations that are bugging me. I’d love to hear your thoughts (itbeth2@gmail.com).

1. Several colleagues who worked ten or more years for public universities have left academia for the commercial sector. Sadly, their intellectual contributions will no longer shape student futures. For this reason, I think everyone should volunteer to support STEM in their local schools.

2. With fewer academics funded to attend SC, has anyone noticed an impact to the technical track offerings? How will federal and university employees keep their skills fresh if they can’t attend conferences? How soon will this skill gap impact the global workforce pipeline? Let’s use this to help frame the case for greater federal travel and conferencing support for STEM activities.

3. When they saw my program committee badge, two vendors complained to me that they swiped fewer badges on the floor this year. One said he captured 2,000 in 2013, but only 1,200 this year. He was worried his company wouldn’t send him to SC15. We need to do whatever we can to bring people into the show floor next year. Open it up to the public, if necessary. Admit local business owners, educators, school groups, and STEM clubs for free.

4. Would the natural competitive instinct that's essential (and inherent) to those in sales roles erode the platform of collegiality and cross-cultural collaboration that SC is well known for? I've worked in sales, so I know how they roll.

5. I fear a profit-driven culture will obfuscate the learning process. It's one thing if a scholar prefers one solution over another, but vendors are paid to promote their own products.

6. There’s an emphasis on entrepreneurship, which is great, but I fear students are ill prepared to market themselves and their innovation. Not everyone is capable of running a business. Maybe it would be helpful to form a Rotarian chapter for STEM entrepreneurs?

7. And last, but never least, we must continue to develop a systemic effort to broaden the participation of underrepresented groups and regions so that innovation is driven by a STEM community that understands the grandest challenges faced by the world we share.

OK, time to pack. Au revoir, ma belle Nouvelle-Orléans!

Wednesday, September 3, 2014

Pilot allows researchers to log in with local campus credentials to access U.S. hosted astrophysics data

Internet2’s InCommon has enhanced its support for international research organizations through a pilot project with the Leonard E. Parker Center for Gravitation, Cosmology and Astrophysics (CGCA) at the University of Wisconsin-Milwaukee, U.S.

The pilot will enable astronomers worldwide, including members of the Laser Interferometer Gravitational-Wave Observatory (LIGO), a project to detect and study gravitational waves from astrophysical sources such as black holes and supernovae, to use their local campus credentials to log in to three UWM-based services. The CGCA plays a key role in LIGO, which was the impetus for creating these collaboration services for gravitational-wave and other astronomers. By participating in the pilot, CGCA identity management staff are streamlining access to these important tools, while saving the time and effort of creating and maintaining separate (duplicate) user IDs and passwords for hundreds of researchers worldwide. The new approach will enable researchers to gain immediate access to these resources simply by logging in with the home campus-issued credentials they already have in place.

This map demonstrates the interest in worldwide collaboration. Those countries with national research and education federations participating in eduGAIN are in green, with countries in the process of joining in yellow.






InCommon has previously partnered with CGCA and LIGO to provide secure federated access for researchers at U.S. institutions. By joining the international eduGAIN service, InCommon extends this benefit to researchers in other parts of the world. InCommon participants can make this process even easier by supporting the Research & Scholarship (R&S) program, in which a campus automatically releases a small number of user attributes to all services tagged as R&S. This allows researchers to access a service with little or no intervention from their central campus IT department, while still maintaining full control and being in full compliance with federal, state and campus privacy requirements.

The global InCommon-eduGAIN pilot involves exporting the metadata about these three CGCA services to the international eduGAIN service, which provides trustworthy exchange of information among national research and education federations, like InCommon. The three services include: the Gravitational Wave Astronomy Community Registry; the Gravitational Wave Astronomy Community Wiki; and the Gravitational Wave Astronomy Community List Server. All three services are tagged for R&S.
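In practice, a service is tagged for R&S by carrying an entity-category attribute in its published SAML metadata. As a rough illustration (the entity ID below is hypothetical and the fragment is heavily simplified; real eduGAIN metadata aggregates are far larger), a short script can check for the REFEDS R&S category tag:

```python
# Sketch: detect the REFEDS Research & Scholarship (R&S) entity-category
# tag in a SAML metadata fragment. The metadata string below is a
# hypothetical, minimal example -- not the actual CGCA metadata.
import xml.etree.ElementTree as ET

NS = {
    "md": "urn:oasis:names:tc:SAML:2.0:metadata",
    "mdattr": "urn:oasis:names:tc:SAML:metadata:attribute",
    "saml": "urn:oasis:names:tc:SAML:2.0:assertion",
}
RS_CATEGORY = "http://refeds.org/category/research-and-scholarship"

METADATA = """<md:EntityDescriptor xmlns:md="urn:oasis:names:tc:SAML:2.0:metadata"
    entityID="https://example.cgca.uwm.edu/shibboleth">
  <md:Extensions>
    <mdattr:EntityAttributes xmlns:mdattr="urn:oasis:names:tc:SAML:metadata:attribute">
      <saml:Attribute xmlns:saml="urn:oasis:names:tc:SAML:2.0:assertion"
          Name="http://macedir.org/entity-category">
        <saml:AttributeValue>http://refeds.org/category/research-and-scholarship</saml:AttributeValue>
      </saml:Attribute>
    </mdattr:EntityAttributes>
  </md:Extensions>
</md:EntityDescriptor>"""

def is_research_and_scholarship(metadata_xml: str) -> bool:
    """Return True if the entity's metadata carries the R&S category tag."""
    root = ET.fromstring(metadata_xml)
    path = "md:Extensions/mdattr:EntityAttributes/saml:Attribute/saml:AttributeValue"
    return any(v.text == RS_CATEGORY for v in root.findall(path, NS))

print(is_research_and_scholarship(METADATA))  # True
```

A campus identity provider that supports R&S releases a small, fixed attribute bundle to any service tagged this way, which is what lets researchers log in without per-service IT intervention.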

“We are delighted to pilot the world-wide sharing of CGCA and LIGO services,” said Klara Jelinkova, chair of the InCommon Steering Committee and Senior Associate Vice President and Chief Information Technology Officer at the University of Chicago. “As a community we are indebted to the University of Wisconsin-Milwaukee, and particularly LIGO’s Scott Koranda and Warren Anderson, as fellow innovation pioneers in our international efforts to support research and education.”

Computer simulation of two black holes merging into one, and the release of energy in the form of gravitational waves. Photo credit Bernd Brügmann (Principal Investigator), Max Planck Institute for Astrophysics, Garching, Germany.



About Internet2
Internet2® is a member-owned advanced technology community founded by the nation's leading higher education institutions in 1996. Internet2 provides a collaborative environment for U.S. research and education organizations to solve common technology challenges, and to develop innovative solutions in support of their educational, research, and community service missions.
Internet2 also operates the nation’s largest and fastest coast-to-coast research and education network, whose Network Operations Center is powered by Indiana University. Internet2 serves more than 93,000 community anchor institutions, 250 U.S. universities, 70 government agencies, 38 regional and state education networks, 80 leading corporations working with our community and more than 65 national research and education networking partners representing more than 100 countries.
Internet2 offices are located in Ann Arbor, Mich.; Denver, Colo.; Emeryville, Calif.; Washington, D.C.; and West Hartford, Conn. For more information, visit www.internet2.edu or follow @Internet2 on Twitter.

About InCommon
InCommon®, operated by Internet2®, serves the U.S. education and research communities, supporting a common framework of trust services, including the U.S. identity management trust federation for research and education, a community-driven Certificate Service, an Assurance Program providing higher levels of trust, and a multifactor authentication program. InCommon has more than 600 participants, including higher education institutions and research organizations, and their sponsored partners.

About the University of Wisconsin-Milwaukee

As Wisconsin’s premier public urban institution, UW-Milwaukee enjoys a growing national reputation for excellence in research, teaching and community engagement. On an operating budget of $680 million, it educates nearly 28,000 students and is an engine of innovation for Southeastern Wisconsin. The 104-acre main campus and satellite sites are located in the economic and cultural heart of the state. The university’s recent expansion includes new academic and research facilities, including the Joseph J. Zilber School of Public Health and the only School of Freshwater Sciences in the United States. For more information, visit http://www4.uwm.edu/ or follow @UWM on Twitter.

About the Leonard E. Parker Center for Gravitation, Cosmology and Astrophysics

The Leonard E. Parker Center for Gravitation, Cosmology and Astrophysics (CGCA) is supported by NASA, the National Science Foundation, UW-Milwaukee College of Letters and Science, and UW-Milwaukee Graduate School. We push the frontiers of astrophysics through the novel use of observation, theory and computation. By bringing together expertise in gravitational physics, astrophysics and computing, CGCA can address scientific challenges in relativistic astrophysics, gravitational-wave astronomy, particle astrophysics, cosmology, and quantum gravity. For more information, visit http://www.gravity.phys.uwm.edu/.


Internet2 Contact: Dean Woodbeck
(906) 523-9620

UWM Contact: Elizabeth Leake
(414)229-3795

Wednesday, July 30, 2014

Why broader engagement (with HPC) matters!

The Supercomputing Conference Broader Engagement (SC-BE) program provides an on-ramp to the high-tech conference experience for minorities, women and people with disabilities. Read why SC13-BE was especially important to Corey Henderson (University of Wisconsin-Madison-US), and his conference mentor, Richard Barrett (Sandia National Laboratories-US). Via STEM-Trek.




Monday, July 28, 2014

Supercomputing and Distributed Systems Camping School 2014, Now in the Coffee Land!


In a spectacular natural environment, in the heart of the coffee land in Colombia, researchers and students will gather for the 2014 edition of the Supercomputing and Distributed Systems Camp (SC-Camp).

Inspired by the idea of bringing technology and nature together, SC-Camp is an initiative by researchers to offer undergraduate and master's students state-of-the-art lectures and hands-on programming sessions on High Performance and Distributed Computing topics. SC-Camp is a non-profit event, open to every student, including those who lack financial support. Last year, all students who applied in due time received a grant to attend the lectures, including meals and accommodations. This year the event will be hosted by BIOS Centro de Bioinformática y Biología Computacional in the outstanding park of Manizales, Colombia.



This year, the summer school focuses on bioinformatics and technology trends in biology and the life sciences. SC-Camp 2014 features 6 days of scientific sessions and 1 leisure day with an organized activity (for example, hiking or rafting). Several hands-on parallel programming sessions will be held alongside the lectures.
We welcome applications from undergraduate students (preferably in their final year) or master's students from all areas of engineering and science with a strong interest in High Performance and Distributed Computing. Because of the advanced content of the lectures, basic notions of Parallel and Distributed Computing, along with programming skills, are desired. All courses and lectures will be held in English, so a good command of English, both oral and written, is mandatory. The scientific and steering committees will evaluate the application forms based on the applicant's scientific background and motivation letter. This year, for the first time, we expect to accept between 80 and 120 students. The registration fee includes accommodation and access to all scientific lectures. More information and registration at http://www.sc-camp.org

Sunday, June 29, 2014

CHPC: It's time to think wildly different for a change! Let's design an energy-efficient HPC system!

Kruger National Park is home to Africa's Big Five: Lion
Leopard, Cape Buffalo, Rhino, and Elephant

South Africa's Center for High Performance Computing (CHPC) invites everyone to attend their 2014 National Meeting in The Kruger National Park. 


One of the world's largest wildlife reserves will offer the perfect backdrop for this year's program which will focus on the development of HPC systems and applications that leave little or no environmental footprint. Additionally, there's an emphasis on workforce development.


One driver behind CHPC's priorities is the Square Kilometre Array (SKA) project: the most powerful radio telescope ever designed. The iconic endeavor is being installed in the extraordinarily “radio quiet” Karoo region of South Africa's Northern Cape Province, and will include remote stations in SKA African partner countries such as Botswana, Namibia, Mozambique, Mauritius, Madagascar, Kenya and Ghana. Cooler-running systems and applications are needed for facilities located in remote, warm climates. Additionally, SKA and the projects that will grow from its roots need an indigenous workforce that's prepared for the future.


The conference welcomes the contributions and expectations of policy-makers, multidisciplinary research communities, vendors, and academia through a series of contributed and invited papers, presentations and open discussion forums. Pre-conference tutorials will speak to the heart of HPC. The main session will include plenary talks and numerous parallel breakaway sessions. The content will be of interest to participants from all scientific domains, with an over-arching emphasis on the research priorities of stakeholders. 

2013 SADC Forum
Two administrative forums will convene during the conference: the Industrial HPC Advisory Forum and the Southern African Development Community (SADC) Forum. It'll be exciting to learn how far SADC has come since the 2013 meeting, where they discussed the development of a shared e-infrastructure for SADC member states and their collaborators. With several points of presence in sub-Saharan Africa, they are laying the foundation for a world-class research cyberinfrastructure in a proving ground that holds tremendous opportunity for multinational collaboration, innovation and discovery.

The South African CHPC Team defended their title
at the ISC14 student cluster challenge
last week in Leipzig, Germany! Go SA!!
The e-infrastructure will not only provide SADC member states with additional computational resources, the community that uses it will lend diversity to the global HPC workforce. At the International Supercomputing Conference in Leipzig last week, the CHPC team won the student cluster challenge for the second year in a row! In December, the next generation will battle it out for a chance to compete at ISC15. 

With the unique location of this conference, space could be limited. Register today for a once-in-a-lifetime opportunity! 

Tuesday, June 24, 2014

Putting the ‘super’ in supercomputing at ISC’14

Image courtesy Tim Krieger/ISC Events.


This week, International Science Grid This Week (iSGTW) is attending the International Supercomputing Conference (ISC’14) in Leipzig, Germany. The event features a range of speakers representing a wide variety of research domains. This includes a fascinating keynote talk given on Monday morning by Klaus Schulten, director of the theoretical and computational biophysics group at the University of Illinois at Urbana-Champaign, US, on the topic of large-scale computing in biomedicine and bioengineering.

A number of high-profile prizes were also awarded on Monday. The ISC award for the best research poster went to Truong Vinh Truong Duy of the Japan Advanced Institute of Science and Technology and the University of Tokyo, Japan. He presented work on OpenFFT, which is an open-source parallel library for computing three-dimensional ‘fast Fourier transforms’ (3-D FFTs).
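For context, a 3-D FFT decomposes a volume of sampled data into its frequency components; libraries such as OpenFFT distribute that computation across many processes. A minimal single-process sketch of the underlying transform, using NumPy rather than OpenFFT itself:

```python
# Single-process illustration of a 3-D fast Fourier transform, using
# NumPy rather than OpenFFT (which parallelizes the work across nodes).
# Shown only to clarify what a 3-D FFT computes.
import numpy as np

# A small 2 x 2 x 2 volume of sample data.
volume = np.arange(8, dtype=float).reshape(2, 2, 2)

# Forward 3-D FFT: decompose the volume into frequency components.
spectrum = np.fft.fftn(volume)

# The zero-frequency (DC) component equals the sum of all samples.
print(spectrum[0, 0, 0].real)  # 28.0

# The inverse transform recovers the original volume.
recovered = np.fft.ifftn(spectrum).real
print(np.allclose(volume, recovered))  # True
```

The hard part that libraries like OpenFFT address is not the transform itself but decomposing the 3-D grid across processes so that the all-to-all communication between transform stages scales.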

Meanwhile, both the Partnership for Advanced Computing in Europe (PRACE) and Germany’s Gauss Centre for Supercomputing awarded prizes for the best research papers. The PRACE award went to a team from the Technical University of Munich and the Ludwig Maximilian University of Munich, Germany, for its work optimizing software used to simulate seismic activity in realistic three-dimensional Earth models. Meanwhile, the GAUSS award went to a team from IBM Research and the Delft University of Technology in the Netherlands for their analysis of the compute, memory, and bandwidth requirements for the key algorithms to be used in the Square Kilometre Array radio telescope (SKA), which is set to begin the first phase of its construction in 2018.

Another source of competition at the event is the announcement of the new TOP500 list of the world’s fastest supercomputers. The new list held little in the way of surprises, with China’s Tianhe-2 remaining the fastest supercomputer in the world by a significant margin. Titan at Oak Ridge National Laboratory in Tennessee, US, and Sequoia at Lawrence Livermore National Laboratory in California, US, remain the second and third fastest systems in the world. The Swiss National Computing Centre’s Piz Daint is once again Europe’s fastest supercomputer and is also the most energy efficient in the top ten. Perhaps the most interesting aspect of Monday’s announcement, however, is the fact that for the second consecutive list, the overall growth rate of all the systems is at a historical low.



Read more in our full feature article in iSGTW next week.

Wednesday, May 21, 2014

EGI Federated Cloud launched at community event in Helsinki

The EGI Community Forum 2014 was hosted at the University of Helsinki.
This week, iSGTW is attending the European Grid Infrastructure (EGI) Community Forum 2014 in Helsinki, Finland. So far, the event has seen the launch of the EGI Federated Cloud, as well as a range of exciting presentations about how grid and other e-infrastructures are being used to advance excellent science in Europe.

The EGI Federated Cloud has been created to support development and innovation within the European Research Area and was designed in collaboration with a wide range of research communities from across the continent. Built on the experience of supporting scientists’ IT needs for over ten years, the EGI Federated Cloud provides researchers with a flexible, scalable, standards-based cloud infrastructure.

“The Federated Cloud is the next step in evolution for EGI,” says EGI.eu managing director Yannick Legré. “We have to support the researchers and understand their needs so we can engage, grow and innovate together.” At launch, the EGI Federated Cloud offers 5,000 cores and 225 terabytes of storage. However, this is set to increase to 18,000 cores and 6,000 terabytes before the end of this year. Legré also recently revealed to iSGTW that EGI has the goal of ramping this up further to 10,000,000 cores and 1 exabyte (1,000,000 terabytes) of storage by 2020.

This ambitious vision was reiterated during yesterday’s launch by David Wallom, chair of EGI’s Federated Clouds Task Force. “I am delighted to be able to announce that after so much hard work from everyone involved we now have a research-orientated cloud platform based on open standards that is ready to support every researcher in Europe,” says Wallom. “This is an important milestone for all areas of research in Europe.”

Another highlight of the first days of the EGI Community Forum was a speech given by Thierry van der Pyl, director of ‘excellence in science’ in the European Commission Directorate General for Communications Networks, Content, and Technology (DG CONNECT). “Today, science itself is being transformed: All disciplines are now becoming computational, with more and more data to be processed,” says Van der Pyl. “E-infrastructures are part of the digital revolution transforming our society, re-inventing industry, and changing science.”

During his talk, Van der Pyl also praised EGI for the progress it has made over the last decade: “I would like to congratulate the EGI community for its achievements in building a truly European infrastructure — I think this is a remarkable result.”

Be sure to follow iSGTW on Twitter, Facebook, and Google+ for further updates from the event under the hashtag #EGICF14. We'll also have a full roundup of the event in our 28 May issue.


- Andrew Purcell is editor-in-chief of International Science Grid This Week (iSGTW).

Sunday, April 27, 2014

Apply by June 15 for the SC14 Broader Engagement (BE) program

SC13 BE Scholars
Every November, more than 10,000 of the world’s leading experts in high performance computing (HPC), networking, data storage and analysis convene for the international Supercomputing (SC) conference, the world’s largest meeting of its kind. To help increase the participation of individuals who have been traditionally under-represented in HPC, the SC conference sponsors the Broader Engagement (BE) program. SC14 will be held Nov. 16-21, 2014 in New Orleans. Complete conference information can be found at: http://sc14.supercomputing.org.

Applications are now being accepted for the SC14 BE program, which offers special activities to introduce, engage and support a diverse community in the conference and in HPC. Competitive grants are available to support limited travel to and participation in the SC14 Technical Program. Consideration will be given to applicants from groups that traditionally have been under-represented in HPC, including women, African Americans, Hispanics, Native Americans, Alaska Natives, Pacific Islanders and people with disabilities. We encourage applications from people in all computing-related disciplines—from research, education and industry.

If you don’t need support but would like to participate, please register to attend the BE workshop or volunteer to serve as a mentor in BE’s Mentor-Protégé Program to support newcomers to the conference.

Questions? Contact be@info.supercomputing.org.

To apply, visit: https://submissions.supercomputing.org/

Friday, April 11, 2014

Call for Papers for CARLA 2014 Conference - First HPCLaTAM - CLCAR Joint Conference in Valparaiso, Chile.


Call for Papers - CARLA 2014: http://www.ccarla.org/
Latin America High Performance Computing Conference 
First HPCLATAM – CLCAR Joint Conference
Organized by CCTVal (USM) & NLHPC (UCHILE)
20 - 22 October 2014, Valparaíso, Chile

IMPORTANT DATES
Paper submission deadline: 15 June 2014
Notification of acceptance: 15 August 2014
Camera-ready papers due: 15 September 2014
Conference dates: 20 - 22 October 2014

AIMS & SCOPE
Building on the success of the previous editions of the HPCLATAM and CLCAR conferences, in 2014 both major Latin-American HPC workshops will join in CARLA 2014, to be held in Valparaíso, Chile. The conference also includes the Third High Performance Computing School, ECAR 2014 (13-17 October 2014).

The main goal of the CARLA 2014 conference is to provide a regional forum fostering the growth of the HPC community in Latin America through the exchange and dissemination of new ideas, techniques, and research in HPC. The conference will feature invited talks from academia and industry, and short- and full-paper sessions presenting both mature work and new ideas in research and industrial applications. Suggested topics of interest include, but are not restricted to:
- Parallel Algorithms and Architectures
- Parallel Computing Technologies
- High Performance Computing Applications
- Distributed systems
- GPU Computing: Methods, Libraries and Applications
- Grid and Cloud Computing
- Data Management and Visualizations and Software Tools
- Tools and Environments for High Performance System Engineering

INSTRUCTIONS FOR AUTHORS
Authors are invited to submit full and short papers written in English, according to the Springer LNCS style (http://www.springer.com/computer/lncs?SGWID=0-164-6-793341-0). Full papers must be between 10 and 15 pages, and short papers between 4 and 6 pages. Papers must not have been previously published or submitted elsewhere. Submissions will be handled electronically using the EasyChair system (https://www.easychair.org/conferences/?conf=carla2014). All paper submissions will be carefully reviewed by at least two experts and returned to the author(s). The authors of accepted papers must guarantee that their paper will be presented at the conference.
All accepted full papers will be included in the CARLA 2014 proceedings, which will be published in the CCIS series of Springer: Communications in Computer and Information Science (http://www.springer.com/series/7899). In addition, authors of accepted full papers will be invited to submit extended versions to a Special Issue of Cluster Computing (impact factor: 0.776): http://www.springer.com/computer/communication+networks/journal/10586
All accepted short papers will be published electronically in the “Book of Ongoing Works”. 


Tuesday, February 25, 2014

Cloud for a smart economy and smart society at Cloudscape VI


Cloudscape VI came to a close just moments ago. The event, which was held at the Microsoft Centre in Brussels, Belgium, featured much discussion of the legal aspects of cloud computing, as well as the potential benefits of cloud computing for both business and scientific research. Bob Jones, head of CERN openlab, also gave an important update on Helix Nebula at the event and we’ll have full coverage of this, as well as the rest of the conference highlights, in next week’s issue of iSGTW.


Ken Ducatel, head of software and services, cloud computing at the European Commission, spoke at Cloudscape VI about the variety of business models still evolving around cloud computing. “There are a lot of business models and they’re very complex: there’s no one-size-fits-all solution,” he says.


Meanwhile, Linda Strick, coordinator of the Cloud for Europe project, spoke at the event about how the different nation states which exist in Europe can make the provision of public services via the cloud particularly difficult.  “We need to initiate dialogues between public sector and industry, and address concerns on data protection, security, legal, and contractual aspects,” says Strick.


In addition, the environmental impacts of cloud computing were discussed at the event. “ICT is enabling energy reduction through optimization, but ICT also consumes a lot of energy,” says Kyriakos Baxevanidis, deputy head of the European Commission’s Smart Cities and Sustainability unit within DG CONNECT. “If the cloud were a country, it would rank fifth in the world in terms of energy consumption.”


Other highlights of the event included a brief presentation from Simon Woodman from the University of Newcastle, UK, on e-Science Central, as well as information given on the EU Brazil Cloud Connect project, which kicked off earlier this month. You can read more about e-Science Central in our in-depth interview, and we’ll have more on EU Brazil Cloud Connect in next week’s issue of iSGTW.


Finally, it was announced at the event that the popular Cloudscape conference series will soon be heading away from Brussels for the first time, with the first Cloudscape Brazil conference set to be held later this year. Again, we’ll have more on this in next week’s issue…

Thursday, August 29, 2013

A Latin America Collage in High Performance and Large Scale Computing

Speakers and Contributions in CLCAR 2013
CLCAR 2013 in San José, Costa Rica, showed an interesting "collage" of High Performance and Large Scale Computing activity on the continent. Diversity of users, important scientific contributions, and one thing in common: collaboration.

Collaboration among scientific and academic institutions makes it possible to develop new ideas and proposals. Interaction between Europe and Latin America, and between North America and Latin America, is increasingly open.

Some of the contributions to this edition of CLCAR address global interests, unresolved technical problems, and open questions in related areas. This year in Costa Rica, inspired by the greenest country on the continent, bioinformatics and biodiversity are common themes in most of the projects.

On the other hand, researchers and the Latin American HPC community have met, supported by RedCLARA and European institutions such as CETA-CIEMAT and the Barcelona Supercomputing Center (BSC), to continue developing the Advanced Computing Service for Latin America and the Caribbean (SCALAC, from its Spanish/Portuguese acronym). This important meeting is the second face-to-face gathering to address a large-scale advanced computing facility, building on the experience of the past projects EELA, EELA-2 and GISELA.

CLCAR 2013 continues until tomorrow. GridCast and iSGTW have been media partners of this important Latin American activity since 2007.

Tuesday, August 27, 2013

PURA VIDA from Costa RICA: Starting CLCAR 2013 with Tutorials

CLCAR 2013 in San José Costa Rica
Pura vida is the traditional expression for many "good" things in Costa Rica: gratitude, friendship, harmony, happiness...

This year, the tutorial sessions opened CLCAR 2013 with many "good topics": exploiting GPGPU architectures with OpenCL and CUDA, the BioCLCAR tutorial on HPC for the biosciences, and the BOINC with LEGION tutorial.

Attendees of CLCAR 2013 Tutorials 
Students and instructors from different countries of Latin America joined to share knowledge of technologies, methodologies and collaboration experiences, ranging from students just beginning with large-scale architectures to participants taking their first steps toward HPC opportunities.

Tomorrow, CLCAR 2013 continues with the second day of the BioCLCAR tutorial and other subjects related to the exploitation of large-scale architectures, e-science and collaboration. Since 2007, CLCAR has been the conference for high performance and scientific computing in Latin America.

The first day of CLCAR was a "PURA VIDA" day of collaboration, friendship and e-science.

Saturday, July 27, 2013

A spotlight on the gut at XSEDE'13


Possibly one of the most literally ‘in-depth’ talks I’ve attended at a computing conference came from Larry Smarr of the J. Craig Venter Institute. He has got involved in biomedical research in the most personal way possible – by presenting his microbiome, or microbial profile, to be scrutinised in minute detail by researchers. The body has 10 times as many microbe cells as human cells. In fact, 99% of the DNA in your genome is in microbial cells, not in human cells. “Topologically we’re a torus”, said Smarr. “Effectively, the bugs are all on the outside of your body, including your guts.”

Smarr’s interest in the microbiome started with increasingly severe but misdiagnosed stomach problems. Smarr was not impressed with the guesswork involved in treating his symptoms. He thought DNA sequencing should give a much more accurate picture of what was going on, eventually leading to a diagnosis of early Crohn's disease, an inflammatory bowel condition. With the cost of sequencing a genome having fallen from 4 billion dollars to 4,000 dollars, financing the research is not so much the issue – it’s picking out the microbial DNA from the human, and then identifying what you have left. Research using the Gordon supercomputer at SDSC still found a significant proportion of DNA that was ‘unknown’.

What is clear though is that Crohn’s disease has a devastating effect on the gut bacteria – the equivalent of a ‘mass extinction event’ in Smarr’s words. The traditional medical approach of waging war on the gut microbes using heavy-duty drugs was not going to help – the better approach is to switch from full frontal attack to ‘gardening’. This means using new therapeutic tools to manage the microbiome and encourage ‘good’ bacteria. Diet is also an important factor. “Change the feedstock and you’ll change the shape of the garden,” advised Smarr. As he’s still donating all sorts of raw materials to the research programme on a regular basis, he should know.

(You can read more about microbial gardening on the CNN website – even the Economist has got in on the act with its cover page article ‘Microbes Maketh the Man’)

Biosciences Day at XSEDE’13 closed with a lively panel session featuring speakers from a range of high-tech biomedical companies, including at least one lawyer. This is not as strange as it first sounds, because many of the issues affecting biomedical research come down to the ethics of sharing very personal data. “If you surprise a bioethicist with a question, the answer is always no,” said Glen Orter of Dell. Alex Dickinson of Illumina Inc. looked at the role of cloud computing in genomics, including two killer apps – whole-genome assembly and variant calling – as well as single-click sharing of very large data sets. His wish list included cloud bioinformatics featuring openness and collaboration. “Interpretation is now the key, since analysis is no longer the bottleneck,” said Dickinson. This means thinking big and looking at phenotypes (how genes are expressed in the body), not just genotypes. “We want whole genomes and lots of them,” he announced.

Donald Jones of Qualcomm Life talked about connected data, wherever it originates: inside the body, outside the body or from instruments. To bring in a Star Trek reference, this is the ‘tricorder challenge’ – to find simple-to-use multimeters for monitoring the body, like the blood glucose meters already used by diabetics. In the future, we’re likely to see increasing numbers of health-related apps.
 
Darryl Leon from Life Technologies advised us to think holistically and also address the environmental costs of intensive computing – can we have low-energy high-performance computing? Nicholas Schork from the Scripps Research Institute said that sequencing genomes might be cheap at a few thousand dollars, but interpretation can cost many times that amount. There are 4–6 million variants per person to explore using algorithms. “This is computationally intensive, and they may not do anything in the end or contribute to disease,” said Schork. There are a host of so-called idiopathic diseases where the cause is still unknown. Increasingly, the “exposome” will become as important as the basic genome – the exposome is everything you’re exposed to in your life, including environmental factors. The more data we share on the exposome, the more information can theoretically be gleaned to lead to new treatments.

The panel speculated that we might eventually see the equivalent of Facebook for sharing personal medical data – would you share your genome as blithely as your holiday photos? But what about genetically identical twins where only one wants to publish their data, and the other prefers privacy? Data might be compressed and encrypted – but this doesn't necessarily offer full protection. Some people with Crohn's, like Larry Smarr, publish all their data in the hope of helping to find a cure. But others want total privacy – the challenge for the future will be to find a way to tread both these paths.
 

Biosciences Day at XSEDE'13

After the excitement of the XSEDE’13 kick-off next door to Comic Con, and the glamour of a 1940s-themed Chair’s dinner aboard the USS Midway, we moved into Biosciences Day. Touring the Midway, we squeaked across the blue and white checked lino of the mess deck (hearty hot meals were served 23 hours out of every 24) and ducked into the cramped confines of the sick bay with its three-tier stacked bunk beds. The operating theatre and dental surgery were right next door to the machine shop. To the untrained eye, the equipment in all three looked broadly similar. (Given the gender bias of the crew, I’ll leave the identity of the two most requested elective operations to your imagination. They weren’t on the teeth.) The basic nature of the medical treatment on offer to the crew, however, was a timely reminder of how far medicine has come in the last half century, particularly now that high performance computing has joined the medic’s armoury.

David Toth, a Campus Champion at the University of Mary Washington, talked us through the role XSEDE resources have played in finding potential drug treatments for histoplasmosis (a fungal infection), tuberculosis and HIV. In the days of the USS Midway, crew with a positive HIV test were immediately airlifted to shore, with a future behind a desk rather than at sea ahead of them. Toth’s group screened 4.1 million molecules using XSEDE resources, a task that would have taken 2 years on a single desktop PC. Some drug candidates are now being tested in the wet lab, with several showing promise in inhibiting cell growth. “Supercomputers do a great job of outpacing the biologists,” said Toth. One enterprising feature of the work was that students each screened 10,000 molecules as part of their course, with one student striking lucky with a potential drug candidate.
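A quick back-of-the-envelope check shows why this screen is such a natural fit for a supercomputer. The 4.1 million molecules and 2-year serial runtime come from the talk; the 1,000-core figure below is purely an illustrative assumption, not a reported number:

```python
# Rough estimate of the virtual screening workload described above.
# Figures from the article: 4.1 million molecules, ~2 years on one desktop PC.
# The 1,000-core count is a hypothetical example of a parallel allocation.

MOLECULES = 4_100_000
SERIAL_SECONDS = 2 * 365 * 24 * 3600   # ~2 years of serial compute

seconds_per_molecule = SERIAL_SECONDS / MOLECULES

def wall_clock_days(cores):
    """Ideal wall-clock time in days, assuming the screen is embarrassingly
    parallel (each molecule docked independently) with no overheads."""
    return SERIAL_SECONDS / cores / 86_400

print(f"~{seconds_per_molecule:.1f} s per molecule")
print(f"~{wall_clock_days(1000):.2f} days on 1,000 cores")
```

Each molecule takes only around 15 seconds to screen, but there are millions of them – and since each docking run is independent, spreading the work across a thousand cores collapses two years of desktop time into well under a day, which also explains how individual students could each take a 10,000-molecule slice.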

Looking at biomedical challenges more broadly, Claudiu Farcas from the University of California in San Diego summarised some of the issues posed by data. For a start, there are many different kinds of data, from the genotype, RNA and proteome, on to biomarkers and phenotypes, right up to population level data, all with their own set of acronyms. Public repositories are often poorly annotated and are mostly unstructured as well as governed by limited and complicated data use agreements. “Researchers are encouraged to share but are not always enabled to do so,” warned Farcas.

A particularly thorny issue for biomedical data analysis is how to deal with sensitive personally identifiable information (PII). Researchers need to protect the privacy of patients by anonymising their data. It also needs to be cleaned up, compressed and aggregated so it can be analysed efficiently. But how best to do this? Bill Barnett of Indiana University said earlier that biologists really don’t care what they use, as long as it works. Cloud computing can be tempting, but institutional clouds are often still in the early stages of being set up, and commercial sources might have “questionable” privacy protection.
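To make the anonymisation step concrete, one common building block is pseudonymisation: replacing a direct identifier with a keyed hash, so records from the same patient can still be linked without exposing who they are. The sketch below (standard library only) is purely illustrative – the field names and secret key are hypothetical, and real de-identification regimes involve far more than hashing one column:

```python
import hashlib
import hmac

# Hypothetical secret held by the data custodian and never shared with
# analysts; without it, the pseudonyms cannot be linked back to patients.
PEPPER = b"site-secret-key"

def pseudonymise(patient_id: str) -> str:
    """Map a direct identifier to a keyed (HMAC-SHA256) pseudonym.

    The same ID always yields the same token, so longitudinal records
    stay linkable, but reversing the mapping requires the secret key."""
    return hmac.new(PEPPER, patient_id.encode(), hashlib.sha256).hexdigest()[:16]

record = {"patient_id": "MRN-004217", "age": 54, "diagnosis": "Crohn's disease"}
safe = {**record, "patient_id": pseudonymise(record["patient_id"])}
print(safe)
```

A keyed hash rather than a plain one matters here: a bare SHA-256 of a medical record number could be reversed by simply hashing every plausible number, whereas the secret key blocks that dictionary attack. Even so, the remaining fields (age, diagnosis, location) can re-identify patients in combination, which is exactly the kind of risk platforms like iDASH are built to manage.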

The iDASH (Integrating Data for Analysis, Anonymization and SHaring) portal allows biologists to focus on the biology, not the data. It includes add-ons such as anonymisation of data, annotation, processing pipelines, natural language processing, and the possibility to preferentially select speedier GPGPU resources. According to Farcas, iDASH offers a secure infrastructure plus a platform for development, services and software, including privacy protection.

Wednesday, July 24, 2013

XSEDE’13 in San Diego – when superheroes met supercomputers

There aren’t too many conferences where you meet Batman, Dr Who and the Wicked Witch of the West while still checking into the hotel. XSEDE’13 in San Diego this year overlapped with the famous Comic Con event right next door – so caped superheroes marching past in the lobby were apparently part of the deal. Comic Con attracts over 120,000 participants every year; XSEDE slightly fewer, at 700. But this number is rising year on year, as project leader John Towns was pleased to point out. And I have a suspicion that the categories of ‘comic book nerds’ and ‘IT geeks’ are perhaps not entirely mutually exclusive sets…

XSEDE, the Extreme Science and Engineering Discovery Environment, is a National Science Foundation-supported project that brings together supercomputers, data collections, and computational tools and services to support science and engineering research and education. The annual XSEDE conference focuses on the science, education, outreach, software, and technology used by the XSEDE and other related communities worldwide.

The programme kicked off with a day of well attended tutorials – with a 200-strong host of students at the event, the training opportunities were well appreciated, as was the opportunity to showcase their work in poster sessions and competitions. Even younger participants were catered for by a morning robotics class, “Ready, Set, Robotical” which I was more than tempted to join.

Training is always a strong theme at XSEDE, and the challenges of providing online courses in parallel computing were discussed, as well as developing undergraduate and graduate programmes in computational science. Campus champions are the backbone of outreach on a regional basis, and XSEDE is now looking to expand the scheme into subject specific champions. This emerged as a theme for future collaboration between XSEDE and the European Grid Infrastructure in the Birds of a Feather session. EGI recently set up an EGI Champions scheme, including representatives from fields as disparate as life sciences, environmental sciences and maths. Other areas where EGI and XSEDE expect to work together include federated high throughput data analysis, federated clouds and user support. One use case already in progress covers HPC-HTC workflows for the computational chemistry community. This was one of the joint use cases that emerged in the call issued at the beginning of the year. So there's lots to watch out for, even now the caped crusaders have (mostly) left town.

Sunday, July 21, 2013

High Performance Computing, e-Science and Scalable Architectures Next to a Volcano


Thirty-six students from Mexico and several Central American countries have begun the 2013 edition of the Supercomputing and Distributed Systems Camp. This year, SCCamp is being held at the Instituto Tecnológico de Colima, in the beautiful town of Villa de Álvarez in Colima State, México. As in 2011, SCCamp is taking place near a volcano: the Colima Volcano.

SCCamp is an initiative by researchers to offer undergraduate and master's students state-of-the-art knowledge of High Performance and Distributed Computing. In the 2013 edition, a 7-day summer school takes students from large-scale infrastructures (cluster, grid, cloud) through to CUDA programming. Instructors come from several countries: Brazil, Colombia, France, Germany, Greece, Mexico and Venezuela.

Past editions of SC-CAMP were held in Colombia (2010), Costa Rica (2011) and Venezuela (2012). Each year an interesting special topic is proposed and selected; this year's is the use of reconfigurable architectures for scientific applications.

SCCamp 2013 is supported by several international partners. GridTalk and iSGTW are media partners of SCCamp.

Thursday, July 18, 2013

iMENTORS goes live!

Mapping ICT across Sub-Saharan Africa
Brussels, 18/07/2013 – iMENTORS goes live and is one step closer to becoming the most comprehensive crowdsourced map of ICT infrastructures in Sub-Saharan Africa! Users can now register, create a profile on the platform and start sharing their knowledge and data directly on the map.
Co-funded by the European Commission’s DG CONNECT under the Seventh Framework Programme (FP7), iMENTORS (www.imentors.eu) is designed to enhance the coherence and effectiveness of international actors involved in e-infrastructure development projects and initiatives in Sub-Saharan Africa. Launched in April 2012 by Stockholm University and Gov2u, iMENTORS is a web-based platform serving as a knowledge repository for sharing and aggregating data on e-infrastructure projects throughout sub-Saharan Africa.
e-Infrastructures – electronic research infrastructures – are collections of ICT-based resources and services used by the worldwide research and education community to conduct collaborative projects and to generate, exchange and preserve knowledge.
iMENTORS strengthens the development of European policy on research infrastructures by providing donors and policy makers with a comprehensive tool to test hypotheses and understand trends in e-infrastructure development assistance across Sub-Saharan African countries. It increases the value of data by providing more descriptive information about international e-infrastructure development, and strengthens efforts to improve donors' and recipients' strategic planning and coordination.
By identifying and mapping the majority of e-infrastructure projects in Sub-Saharan Africa, the project provides valuable knowledge of what is currently on the ground, which in itself represents the first step towards more informed future choices and decision making on e-infrastructure deployment. Additionally, the software tool assists policy-makers and donor agencies in using the information more effectively, filtering it and associating it with other variables to identify trends in aid flows and areas of interest, allowing them to make more informed decisions.
The platform also pushes for stronger cooperation between the different e-infrastructure stakeholder groups, providing decisive support in their efforts to develop globally connected and interoperable e-infrastructures.
The dissemination activities planned in this project have raised the visibility of e-infrastructure activity in the Sub-Saharan African region among wider audiences, especially research communities in EU countries and international development aid agencies. By engaging key e-infrastructure stakeholders at the recipient-country level, the project triggers debate on the relative effectiveness of donors and creates the impetus to move to more effective coordination mechanisms.

For more information visit: www.iMENTORS.eu