Monday, May 26, 2008

Diderot and the Semantic Web

I've read an interesting post on this blog about the relevance of information and the myths of the Semantic Web. I'm afraid it is not fully related to Data Access, but it is interesting nonetheless. Please note it is in French.

The author makes an extremely valuable effort to lay out the expectations and limitations of the Semantic Web. He has some good (and sometimes funny) ideas, even if his limited knowledge of AI leads him to some vague statements about it.

A few remarks about it:

  • The goal of the Semantic Web cannot be reduced to being a smart replacement for Google.

  • Ontologies do not provide a single version of the Truth; they can mimic human redundancies, overlaps, and incompleteness.

  • Dedicated semantic networks are not supposed to be easily connected to build a global brain.

  • Open source has just nothing to do with that.

  • Inference engines are only one facet of AI; AI is much more complex than that.

  • AI is not limited to problem solving, neural networks, and plain Boolean logic.

  • Advanced AI will try to simulate and use human techniques (distributed AI...), and will deal with fuzzy and non-classical logics (temporal logic, epistemic logic, logics of knowledge and belief, fear and love, trial and error...).

  • Just because AI and the Semantic Web won't solve everything does not mean they won't solve anything.

  • Last but not least: the Semantic Web is not competing with human intelligence.


We can agree on:

  • A real Semantic Web is not for tomorrow morning.

  • There is a lot of hype in Web 2.0 (and everything which is "2.0" or "as a service").

    • I recently even saw Data 2.0.




Two articles related to it:


Friday, May 23, 2008

Apache Tuscany is now an official ASF project

In this post on TSS, they insist on Tuscany being the OASIS OpenCSA SCA implementation. But don't forget it is also the Apache implementation of SDO and DAS.

See also Luciano Resende's blog about that.


Referential Integrity Constraints in EF

Here is how the Referential Integrity Constraints are defined and managed in the Entity Framework.

They are defined in the CSDL file (the conceptual schema).

I really appreciate the fact that associations are defined outside the entities. That is probably the best way of doing it. I still don't understand why most mainstream object languages lack real relationship management.

Better relationship management would be one of the most valuable features Java or C# could add in the future. Basically, a reference in Java is nothing more than a wrapper around a memory pointer, and a collection is just a set of references. Nothing more: there is no notion of what sits at the other end, and no relationship management.

Even more dynamic languages like Smalltalk or Groovy still don't have good relationship management.

That said, even in database engines the notion of relationship management is still limited. Yes, you can define foreign keys and attach some behaviour to them, but that only goes so far. We need much more: the ability to fully qualify both sides of an association (cardinality, navigability...) and to use that information at runtime to optimize queries, the ability to distinguish between aggregations and compositions, and the ability to define referential integrity constraints against something that is not in the same database.
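To make this concrete, here is a minimal Java sketch of what first-class associations could look like if the language (or a library) exposed them. Everything below is hypothetical: these classes do not exist in any standard API; they only illustrate the kind of metadata (cardinality, navigability, aggregation versus composition) discussed above.

    import java.util.ArrayList;
    import java.util.List;

    /** Hypothetical cardinality descriptor for one end of an association. */
    enum Cardinality { ONE, MANY }

    /** Hypothetical association kind, so aggregations and compositions can be told apart. */
    enum AssociationKind { PLAIN, AGGREGATION, COMPOSITION }

    /** A hypothetical first-class association that knows both of its ends. */
    class Association<A, B> {
        private final Class<A> endA;
        private final Class<B> endB;
        private final Cardinality cardinalityA;
        private final Cardinality cardinalityB;
        private final AssociationKind kind;

        Association(Class<A> endA, Cardinality cardinalityA,
                    Class<B> endB, Cardinality cardinalityB,
                    AssociationKind kind) {
            this.endA = endA;
            this.cardinalityA = cardinalityA;
            this.endB = endB;
            this.cardinalityB = cardinalityB;
            this.kind = kind;
        }

        /** Metadata a runtime could use to plan navigation or optimize queries. */
        AssociationKind getKind() { return kind; }

        Cardinality getCardinality(Class<?> end) {
            if (end == endA) return cardinalityA;
            if (end == endB) return cardinalityB;
            throw new IllegalArgumentException("Not an end of this association: " + end);
        }
    }

    // Example: a Customer owns many Orders (composition); an Order belongs to exactly one Customer.
    class Order { Customer customer; }
    class Customer { String name; List<Order> orders = new ArrayList<Order>(); }

    class AssociationMetadata {
        static final Association<Customer, Order> CUSTOMER_ORDERS =
            new Association<Customer, Order>(Customer.class, Cardinality.ONE,
                                             Order.class, Cardinality.MANY,
                                             AssociationKind.COMPOSITION);
    }

A runtime holding this kind of metadata could, for instance, decide to prefetch the MANY side of a composition in a single query instead of navigating it lazily.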

Maybe all this means that relationship management should be covered by the Data Services Platform. Even then, we would still need a better notion of relationships within programming languages.


Thursday, May 22, 2008

The JCP at the Paris JUG

Yesterday I was invited to participate in a presentation / roundtable about the JCP organized by the Paris JUG. Patrick Curran, chairman of the JCP, gave a very nice presentation, also introducing future trends for the JCP.

We then had good questions from the audience:

  • Is there any collaboration effort / alignment between the JSRs?

    • This needs to be improved and will be.



  • What will happen to the JCP if Sun is acquired and/or no longer interested in Java?

    • First, this is unlikely to happen.

    • Second, in that case the JCP would turn into a kind of foundation.



  • Is there a special committee in charge of defining what will be in Java SE 7?

    • No, that's a normal JSR.

    • We could have a specific long-term architecture committee for the Java platform later.



  • Why is Apache managing the JDO2 TCK / RI while there is no JSR for JDO 2.0?

    • It is just a maintenance release of JDO2; there is no need for a specific JSR.

    • Collaboration with open source communities is increasing.



  • A question about JNI (I can't remember it, sorry)

  • Is there a TCK framework that can be reused for new JSRs?

    • Yes there is one.



  • What can the JCP do to prevent fragmentation of Java (SWT, SCA, OSGi...)?

    • There is nothing to do; private innovation is normal. In the long term, once successful, these initiatives could join the specification effort.



  • Java has a long list of old JSRs. Is there a cleaning process?

    • I can't remember Patrick's answer to that one, sorry.



  • Who decides which features are part of Java EE or Java SE?

    • The expert groups in the JSRs.



  • What is Sun doing to develop Java on mobile devices against initiatives from Google and Adobe?

    • Nokia, Ericsson and other major players in this area are already part of the JCP and are leading JSRs.




After a short break, we had the roundtable with Patrick Curran, Antonio Goncalves (co-leader of the Paris JUG), Guillaume Laforge (Groovy Spec Lead), and Cedric Thomas, who runs OW2 (Object Web 2.0), the open source consortium. We had a good, animated debate with sharp questions:

  • How can TCK licensing be simplified?

    • Working on it...



  • Are free RIs (reference implementations) a threat to vendors?

    • No, they are not.



  • Do we really need TCKs?

    • I said that a TCK is more a tool than a constraint.

    • We embed it into our own test suite.

    • It helps improve the specification.



  • The need for more process around the specification work (timelines, voting rules, etc.)

  • Should performance and QoS be part of the TCK?

    • No, the TCK just checks feature compliance.

    • Vendors will then compare products based on performance, quality...



  • Why is JavaFX not defined within the JCP?

    • That used to be a private initiative; it could join the JCP later.



  • Why is Java the only language with such a heavy specification process?

    • Java is not alone; there are numerous IT standardization organizations (400 in the US alone, according to Patrick).

    • OASIS is another example.



  • Once again a question about Java SE 7...


After that good roundtable, it was time for some beers, and we had plenty of time to reinvent the whole IT space.

Thank you to Antonio Goncalves and David for this nice event. Antonio is a talented IT expert and consultant; he is also a member of the JPA 2 expert group and was one of the first customers of Xcalia. He also likes Jaco Pastorius, the legendary jazz bass player, which proves he has good taste and genuine values.


Wednesday, May 21, 2008

Entity Framework FAQ

An excellent FAQ found on Danny Simmons' blog.


Data Services World

I forgot to mention that SysCon will organize a key event for Data Services.

It will take place in June in New York, and John Goodson will represent DataDirect Technologies (we are a diamond sponsor of the event).

http://dataservicesworld.sys-con.com/


How Data Services Are the New Frontier for Data Integration

John Goodson, VP & GM of DataDirect Technologies (my boss, then), is interviewed by Jeremy Geelan (SysCon).

http://www.sys-con.com/read/529442.htm.

I can obviously only agree 100% with what is said here.


Java Object Persistence: State of the Union Part II

Second part of the Java Object Persistence: State of the Union from ODBMS.org, with various experts from Microsoft, Progress (ObjectStore), Versant (Robert Greene), the University of Texas at Austin, and Rick Cattell.

See Microsoft's point about the impedance mismatch in Data Services, and also how they position the Entity Framework versus traditional ORM (something that goes further, according to them, because of the well-defined EDM). They also think RDBMSs will natively support some kind of EDM and will use LINQ in their stored-procedure programming languages.

An old but good paper from Microsoft about LINQ and EF. It announced the future... which is now!

LINQ is perceived as important by almost everybody. Native queries and AutoFetch are other interesting projects mentioned by the University of Texas at Austin (already blogged about here).


ODBMS: Quo Vadis?

http://www.odbms.org/blog/2008/05/object-database-systems-quo-vadis.html

Nothing really new except the LINQ part.


Entity Framework: more videos

http://blogs.msdn.com/adonet/archive/2008/05/20/how-do-i-new-entity-framework-videos.aspx


Entity Framework and ORM

Microsoft is still trying to position the Entity Framework versus more traditional ORM approaches.

I've seen this blog entry about it on Zlatko Michailov's interesting blog.

Summary:

  • EF is much more than an ORM

  • ORM is basically: Object Model --- --- DB Model

  • EF is: Object Model --- Conceptual Model --- DB Models


The conceptual model is the EDM (Entity Data Model).

What they say is that the "mapping facility" of EF is just the mapping between the object model and the conceptual model; this "OCM" is the equivalent of an ORM. Then there is a second mapping layer between the conceptual and the database models.
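To visualize the two layers, here is a deliberately simplified Java sketch of the idea (the Entity Framework itself is .NET, and every name below is hypothetical): one mapper between the object model and the conceptual model, a second one between the conceptual model and the store, composed into a single pipeline.

    import java.util.HashMap;
    import java.util.Map;

    /** Hypothetical conceptual-level record, playing the role of an EDM entity. */
    class ConceptualEntity {
        final String entitySet;
        final Map<String, Object> properties = new HashMap<String, Object>();
        ConceptualEntity(String entitySet) { this.entitySet = entitySet; }
    }

    /** Layer 1: object model <-> conceptual model (the "OCM", roughly an ORM). */
    interface ObjectConceptualMapper<T> {
        ConceptualEntity toConceptual(T object);
        T fromConceptual(ConceptualEntity entity);
    }

    /** Layer 2: conceptual model <-> store model (tables, services, legacy screens...). */
    interface ConceptualStoreMapper {
        void save(ConceptualEntity entity);
        ConceptualEntity load(String entitySet, Object key);
    }

    /** The two layers composed: roughly what a framework hides behind its API. */
    class TwoLayerMappingPipeline<T> {
        private final ObjectConceptualMapper<T> ocm;
        private final ConceptualStoreMapper store;

        TwoLayerMappingPipeline(ObjectConceptualMapper<T> ocm, ConceptualStoreMapper store) {
            this.ocm = ocm;
            this.store = store;
        }

        void persist(T object) {
            store.save(ocm.toConceptual(object));                   // object -> conceptual -> store
        }

        T find(String entitySet, Object key) {
            return ocm.fromConceptual(store.load(entitySet, key));  // store -> conceptual -> object
        }
    }

The claimed benefit is that the lower layer can be swapped (relational database, service, legacy screen) without touching the object-to-conceptual mapping.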

Understood, but I still need more information, examples, and use cases showing the real added value of that intermediate level in the case of a relational database. Numerous posts about EF just say it is better to have this intermediate model, but they rarely explain why. In one post someone argued that the mapping with the object model is easier because the conceptual model knows concepts like relationships and inheritance. OK, but this just transfers the complex mappings to the other side.

Conversely, I do agree that this intermediate level is required when accessing data sources that do not expose enough metadata, like the green screens of a mainframe application, for instance. At DataDirect/Xcalia we have an extended metadata language to describe the virtual data model manipulated by a service-oriented data source. We do the mapping between the business (object) model and that intermediate model, and we do have a second mapping below it.

In the case of EF, the mapping capability between the object and conceptual models is quite limited at the moment (1-1 mapping), which limits the potential benefits of the approach.


Tuesday, May 20, 2008

Introduction to EntityDataSource

Crystal clear, by Guy Burstein: http://blogs.microsoft.co.il/blogs/bursteg/archive/2008/05/12/visual-studio-sp1-entitydatasource-overview-screencast.aspx

The new Visual Studio 2008 SP1 comes with a new EntityDataSource that can be bound to visual widgets in ASP.NET pages.


Lazy loading with Entity Framework

A series of good blog entries about workarounds for lazy loading of entities:

http://datadeveloper.net/blogs/news/archive/2008/05/19/entity-framework-and-lazy-loading.aspx

http://www.singingeels.com/Articles/Entity_Framework_and_Lazy_Loading.aspx

http://blogs.msdn.com/jkowalski/archive/2008/05/12/transparent-lazy-loading-for-entity-framework-part-1.aspx

http://blogs.msdn.com/diego/archive/2008/05/12/lazy-loading-in-entity-framework.aspx
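For readers who just want the general idea behind these workarounds, here is a tiny, framework-independent Java sketch of the classic virtual-proxy approach to lazy loading; all names are hypothetical, and the posts above cover the EF-specific details.

    import java.util.List;

    /** Hypothetical finder that actually hits the database. */
    interface OrderLoader {
        List<String> loadOrdersFor(long customerId);
    }

    /** A lazy collection holder: the query runs only on first access. */
    class LazyOrders {
        private final long customerId;
        private final OrderLoader loader;
        private List<String> orders;          // null until first accessed

        LazyOrders(long customerId, OrderLoader loader) {
            this.customerId = customerId;
            this.loader = loader;
        }

        /** Triggers the database round trip the first time it is called. */
        List<String> get() {
            if (orders == null) {
                orders = loader.loadOrdersFor(customerId);
            }
            return orders;
        }
    }

    class Customer {
        private final LazyOrders orders;
        Customer(long id, OrderLoader loader) { this.orders = new LazyOrders(id, loader); }
        List<String> getOrders() { return orders.get(); }   // looks eager to the caller
    }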


Monday, May 19, 2008

Why use the Entity Framework?

I have seen this article in Danny Simmons' blog. He positions the Entity Framework versus various Data Access technologies like ADO.Net, LINQ to SQL and nHibernate.

He first differentiates it from ADO.Net and LINQ to SQL by explaining that EF does mapping. Then he differentiates it from nHibernate by explaining that EF is not better at mapping but is different, because it relies on the EDM and thus separates the process of mapping queries and shaping results from that of building objects and tracking changes. This would make the layer reusable in different contexts: the same conceptual data model (business model) could be reused in different scenarios. I still need more details about the actual benefits, beyond the fact that Microsoft intends to use the EDM widely internally. We understand why the EDM is important to Microsoft; we don't really see the technical benefits of this approach for users.

Please see how Julie Lerman explains in her blog why Microsoft is taking care of the nHibernate community, trying to avoid the flame war that happened with Hibernate in the Java community.

Also see reactions from readers in both blogs.


Thursday, May 15, 2008

Transfer ADO EF entities

This article introduces the EntityBag as a way to export entities out of their persistence context, which is exactly what JDO and JPA also propose with attach/detach. It is also not too far from SDO.
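As a reminder of what this looks like on the Java side, here is a rough JDO 2 detach/attach sketch; the Customer class and the configuration resource are hypothetical, and error handling is omitted.

    import javax.jdo.JDOHelper;
    import javax.jdo.PersistenceManager;
    import javax.jdo.PersistenceManagerFactory;

    /** Hypothetical persistence-capable class; JDO metadata and enhancement are omitted. */
    class Customer {
        private String name;
        public void setName(String name) { this.name = name; }
    }

    public class DetachAttachExample {
        public static void main(String[] args) {
            // "jdo.properties" is a placeholder for vendor-specific configuration.
            PersistenceManagerFactory pmf = JDOHelper.getPersistenceManagerFactory("jdo.properties");

            // 1. Load an entity and detach a copy of it from its persistence context.
            PersistenceManager pm = pmf.getPersistenceManager();
            pm.currentTransaction().begin();
            Customer customer = (Customer) pm.getObjectById(Customer.class, Long.valueOf(42));
            Customer detached = (Customer) pm.detachCopy(customer);   // safe to ship to another tier
            pm.currentTransaction().commit();
            pm.close();

            // 2. Modify the detached copy outside any persistence context.
            detached.setName("New name");

            // 3. Later, reattach it: makePersistent applies the changes back to the datastore.
            PersistenceManager pm2 = pmf.getPersistenceManager();
            pm2.currentTransaction().begin();
            pm2.makePersistent(detached);
            pm2.currentTransaction().commit();
            pm2.close();
        }
    }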


The impact of the cloud on BI

http://www.databasecolumn.com/2008/05/cloud-and-bi.html

The author envisions some impacts of cloud computing on the BI software industry:

  • easier evaluation (nothing to download, no machine / space / skills required to install, no configuration, nothing to uninstall)

  • more short-term ad-hoc analysis

  • more BI projects, because funding is easier and fewer resources are required

  • extension of data warehouse projects to medium size companies

  • analytic SaaS market will grow quickly


Well, maybe it won't be so easy; it will be interesting to track real projects going in this direction and see how they succeed or fail.

Obviously this "BI over the cloud" requires a new kind of database technology (database column is linked with Vertica):

  • Shared-nothing architecture, to quickly absorb peaks in on-demand analysis.

  • Aggressive data compression to reduce storage costs. ==> Agreed, but what about performance?

  • Automatic grid replication and failover ==> The cloud must prove it is at least as robust as an internal installation.


Manipulate Data in the Cloud with ADO.NET

Good article seen in Visual Studio Magazine.

It introduces some new features of the recently released VS 2008 SP1.

Good list of links at the end of the article.

ADO.Net Data Services is going in the REST / Atom direction. Proponents of this approach claim it is simpler than SOAP. When we look at the tools and framework provided by Microsoft, we can see that REST is actually hidden behind APIs and LINQ: all the REST requests are computed and issued by the framework. There is almost no difference from what an SDO / DAS client programmer experiences.
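As a point of comparison, here is what one of those requests looks like when written by hand in Java against a hypothetical ADO.Net Data Services style URL; the framework builds and issues equivalent requests for you behind its API and LINQ surface.

    import java.io.BufferedReader;
    import java.io.InputStreamReader;
    import java.net.HttpURLConnection;
    import java.net.URL;

    public class RestAtomClient {
        public static void main(String[] args) throws Exception {
            // Hypothetical service and entity set; real URIs depend on the deployed service.
            URL url = new URL("http://example.com/dataservice.svc/Customers('ALFKI')");

            HttpURLConnection connection = (HttpURLConnection) url.openConnection();
            connection.setRequestMethod("GET");
            connection.setRequestProperty("Accept", "application/atom+xml");  // ask for the Atom representation

            BufferedReader reader =
                new BufferedReader(new InputStreamReader(connection.getInputStream(), "UTF-8"));
            StringBuilder atomEntry = new StringBuilder();
            String line;
            while ((line = reader.readLine()) != null) {
                atomEntry.append(line).append('\n');
            }
            reader.close();

            // A framework would parse this Atom entry into a typed object for you.
            System.out.println(atomEntry);
        }
    }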

This other article about the Entity Framework is also of interest. Related: Microsoft docs about the Entity Framework.


Monday, May 5, 2008

HBase

HBase is yet another distributed, column-oriented database technology, inspired by Google's BigTable. It is hosted within the Apache Hadoop project (a MapReduce implementation).

InfoQ article about HBase.

Interesting excerpt about Google's App Engine:
However, as a lot of people have noted since the announcement of App Engine, there's a big difference between owning your infrastructure and renting it. It's probably a very good thing for you when you are small, but as soon as you reach a surprisingly low threshold, you're better off hosting it yourself.
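To give a feel for the data model behind BigTable and HBase (row key, column family, column qualifier, timestamped versions), here is a toy Java sketch built on plain sorted maps. It is not the HBase client API, only an illustration of the storage model.

    import java.util.SortedMap;
    import java.util.TreeMap;

    /** Toy illustration of a BigTable-style store: not the real HBase client API. */
    public class ToyColumnStore {
        // row key -> "family:qualifier" -> timestamp -> value
        private final SortedMap<String, SortedMap<String, SortedMap<Long, String>>> rows =
            new TreeMap<String, SortedMap<String, SortedMap<Long, String>>>();

        public void put(String rowKey, String family, String qualifier, String value) {
            if (!rows.containsKey(rowKey)) {
                rows.put(rowKey, new TreeMap<String, SortedMap<Long, String>>());
            }
            SortedMap<String, SortedMap<Long, String>> columns = rows.get(rowKey);
            String column = family + ":" + qualifier;
            if (!columns.containsKey(column)) {
                columns.put(column, new TreeMap<Long, String>());
            }
            // Each cell keeps multiple timestamped versions, as in BigTable/HBase.
            columns.get(column).put(Long.valueOf(System.currentTimeMillis()), value);
        }

        /** Returns the most recent version of a cell, or null if the cell is absent. */
        public String get(String rowKey, String family, String qualifier) {
            SortedMap<String, SortedMap<Long, String>> columns = rows.get(rowKey);
            if (columns == null) return null;
            SortedMap<Long, String> versions = columns.get(family + ":" + qualifier);
            if (versions == null || versions.isEmpty()) return null;
            return versions.get(versions.lastKey());
        }

        public static void main(String[] args) {
            ToyColumnStore store = new ToyColumnStore();
            store.put("row-1", "info", "name", "HBase");
            store.put("row-1", "info", "inspiredBy", "BigTable");
            System.out.println(store.get("row-1", "info", "name"));  // prints "HBase"
        }
    }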


LINQ for Java

Also seen on the db4o community, but it could be of interest to everybody: http://developer.db4o.com/blogs/carl/archive/2008/05/02/linq-for-java.aspx

Old post about Java and LINQ.

Carl (db4o's CTO) would really like to start this initiative, as an open source development, as a JSR and as an Eclipse plugin. A Google group has already been started.

Please note that IBM pureQuery (part of their Data Studio) has some sort of limited LINQ-like support. It used to be known as JLINQ.

Quaere is another limited LINQ-for-Java implementation, hosted on Codehaus.

Jonathan Bruce's post about Quaere.

I've also seen that one: JoSQL.

Charlie Calvert's links on LINQ. At Xcalia, we enjoyed working with Charlie's team in the TAP program in 2007, mostly around LINQ.
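To show what language-integrated querying could feel like in Java without any language change, here is a small hand-rolled fluent helper (pre-closures Java, so an anonymous class stands in for a LINQ lambda). All names are hypothetical; this is not the API of Quaere, JoSQL, or any of the projects mentioned above.

    import java.util.ArrayList;
    import java.util.Arrays;
    import java.util.List;

    /** Hypothetical predicate, standing in for a LINQ lambda. */
    interface Predicate<T> {
        boolean matches(T item);
    }

    /** A tiny fluent query over an in-memory collection: from(...).where(...).select(). */
    class Query<T> {
        private final List<T> source;

        private Query(List<T> source) { this.source = source; }

        static <T> Query<T> from(List<T> source) {
            return new Query<T>(source);
        }

        Query<T> where(Predicate<T> predicate) {
            List<T> filtered = new ArrayList<T>();
            for (T item : source) {
                if (predicate.matches(item)) {
                    filtered.add(item);
                }
            }
            return new Query<T>(filtered);
        }

        List<T> select() {
            return source;
        }
    }

    public class LinqStyleDemo {
        public static void main(String[] args) {
            List<String> languages = Arrays.asList("Java", "C#", "Groovy", "Smalltalk");

            // Roughly: from l in languages where l.length() > 4 select l
            List<String> result = Query.from(languages)
                .where(new Predicate<String>() {
                    public boolean matches(String item) { return item.length() > 4; }
                })
                .select();

            System.out.println(result);  // prints [Groovy, Smalltalk]
        }
    }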


LINQ and db4o

Interesting to see that more and more vendors are embracing LINQ.

See these posts from the db4o community:

LINQ for db4o.
