
What method is most efficient at moving objects across the wire in .NET?


I've been using Web Services to move data across the wire, and that has served me pretty well. It excels at sending small pieces of data, but as soon as you have to move deep object trees with lots of properties, the resulting XML soup turns 100 KB of data into 1 MB.

So I tried IIS compression, but it left me underwhelmed: it compressed the data well, but the trade-off was the CPU cost of compressing and decompressing. Then I serialized the objects with BinaryFormatter and sent that across. That was better, but encode/decode speed is still a concern.
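For reference, a minimal sketch of the BinaryFormatter round trip described above (the Invoice type and its fields are invented for illustration):

    using System;
    using System.IO;
    using System.Runtime.Serialization.Formatters.Binary;

    [Serializable]
    public class Invoice          // hypothetical payload type
    {
        public int Id;
        public string Customer;
        public decimal Total;
    }

    public static class BinaryFormatterDemo
    {
        // Serialize an object graph to a byte[] so it can be sent across the wire.
        public static byte[] Serialize(object graph)
        {
            var formatter = new BinaryFormatter();
            using (var ms = new MemoryStream())
            {
                formatter.Serialize(ms, graph);
                return ms.ToArray();
            }
        }

        // Rebuild the object graph on the receiving side.
        public static T Deserialize<T>(byte[] data)
        {
            var formatter = new BinaryFormatter();
            using (var ms = new MemoryStream(data))
            {
                return (T)formatter.Deserialize(ms);
            }
        }
    }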

Anyway, I'm hearing that I'm stuck in the '00s and that there are now better ways to send data across the wire, such as Protocol Buffers, MessagePack, etc...

Can someone tell me whether these newer protocols are better suited to sending large pieces of data, and whether I'm missing some other efficient way to do this?

By efficient, I mean bandwidth used, encode/decode speed, speed of implementation, and so on.

4 Answers

#1 (11 votes)

It depends on what's making up the bulk of your data. If you've just got lots of objects with a few fields, and it's really the cruft which is "expanding" them, then other formats like Protocol Buffers can make a huge difference. I haven't used MessagePack or Thrift, but I would expect they could have broadly similar size gains.

In terms of speed of encoding and decoding, I believe that both Marc Gravell's implementation of Protocol Buffers and my own will outperform any of the built-in serialization schemes.
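As an illustration of the Protocol Buffers approach mentioned here, a minimal sketch using protobuf-net's contract attributes (the Order type and its fields are hypothetical):

    using System.IO;
    using ProtoBuf;   // protobuf-net (Marc Gravell's implementation)

    [ProtoContract]
    public class Order            // hypothetical payload type
    {
        [ProtoMember(1)] public int Id { get; set; }
        [ProtoMember(2)] public string Customer { get; set; }
        [ProtoMember(3)] public decimal Total { get; set; }
    }

    public static class ProtoDemo
    {
        // Field numbers, not names, go on the wire, which keeps the payload compact.
        public static byte[] Serialize(Order order)
        {
            using (var ms = new MemoryStream())
            {
                Serializer.Serialize(ms, order);
                return ms.ToArray();
            }
        }

        public static Order Deserialize(byte[] data)
        {
            using (var ms = new MemoryStream(data))
            {
                return Serializer.Deserialize<Order>(ms);
            }
        }
    }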

#2 (6 votes)

This depends heavily on where your priorities lie, and what type of client you're using.

WCF provides some great ways to push data across the wire, including many binding options as well as rather efficient serializers such as the DataContractSerializer. That being said, many of these things require a "rich" client that's also using WCF.
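A rough sketch of the contract-based serialization WCF relies on, using DataContractSerializer directly (the Customer type is invented for illustration; WCF's binary bindings pair the same contracts with a more compact encoding than the XML shown here):

    using System.IO;
    using System.Runtime.Serialization;

    [DataContract]
    public class Customer         // hypothetical payload type
    {
        [DataMember] public int Id { get; set; }
        [DataMember] public string Name { get; set; }
    }

    public static class DataContractDemo
    {
        public static byte[] Serialize(Customer customer)
        {
            var serializer = new DataContractSerializer(typeof(Customer));
            using (var ms = new MemoryStream())
            {
                serializer.WriteObject(ms, customer);   // writes XML by default
                return ms.ToArray();
            }
        }

        public static Customer Deserialize(byte[] data)
        {
            var serializer = new DataContractSerializer(typeof(Customer));
            using (var ms = new MemoryStream(data))
            {
                return (Customer)serializer.ReadObject(ms);
            }
        }
    }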

If that's not an option, then something like Protocol Buffers may be a very good approach. This provides a very fast serialization/deserialization as well as a reasonable transmission size for most data.

#3 (2 votes)

Have you checked out Protobuf-net?

#4 (1 vote)

Simon Hewitt's library (written in C#) performs the serialisation, and it is efficient both in bandwidth and in encoding/decoding speed. The library is distributed as C# source code.

The only gotcha is keeping .NET serialisation from kicking in: for data structures the library does not support, you must do explicit encoding in client code so that .NET serialisation is never invoked. This affects speed of implementation, but I got an improvement of a factor of 2-3 in size and a factor of 20-40 in serialisation time.
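To show the kind of explicit, field-by-field encoding this answer means, here is a sketch using the BCL's BinaryWriter/BinaryReader rather than Simon Hewitt's actual API (the Point type is invented for illustration; the point is that no reflection-based serializer is ever invoked):

    using System.IO;

    public class Point            // hypothetical type the library cannot handle
    {
        public int X;
        public int Y;
        public string Label;

        // Write each field explicitly, in a fixed order.
        public void WriteTo(BinaryWriter writer)
        {
            writer.Write(X);
            writer.Write(Y);
            writer.Write(Label ?? string.Empty);
        }

        // Read the fields back in the same order they were written.
        public static Point ReadFrom(BinaryReader reader)
        {
            return new Point
            {
                X = reader.ReadInt32(),
                Y = reader.ReadInt32(),
                Label = reader.ReadString()
            };
        }
    }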

