• RELEVANCY SCORE 3.97

    DB:3.97:Pof-Config File Is Not Validated? zm





    I mistyped pof-config.xml as below

    <pof-config>
      <include>coherence-pof-config.xml</include>   <!-- wrong place -->
      <user-type-list>
        <!-- correct place -->
        <user-type>
          <type-id>1002</type-id>
          <class-name>org.mylab.domain.Customer</class-name>
          <serializer>
            <class-name>org.mylab.serializer.CustomerSerializer</class-name>
          </serializer>
        </user-type>
      </user-type-list>
    </pof-config>

    But the Coherence server does not throw an exception during startup; it only throws a serialization error after a cache put operation. I expected the Coherence server to validate the configuration files during startup.

    DB:3.97:Pof-Config File Is Not Validated? zm

    Hi Akilan,

    Coherence does not validate any of its XML configuration files. You can look at this as a good thing or a bad thing depending on your point of view. Personally I find it a good thing as it means I can put my own XML into the config files. In fact the Incubator project makes good use of this.

    The best thing to do is configure your IDE to validate the files using the DTD files in the Coherence jar.

    JK

  • RELEVANCY SCORE 3.72

    DB:3.72:Pof Serialization Issue (Hashmap) sz





    Hi.

    It looks like the following usage produces an EOFException during Java deserialization.

    I couldn't find any clear documentation regarding whether readMap (Java) should be paired with ReadDictionary (.NET) for POF, but currently, using a type pair of Dictionary<String, Double> on the .NET side and HashMap<String, Double> in Java, we started getting the error below.

    Is it a wrong use? Should Hashtable be used instead? Or some other collection? Please advise.

    Error: An exception occurred while decoding a Message for Service=Proxy:ExtendTcpProxyService:TcpAcceptor received from: TcpConnection(Id={CUT}, Open=true, Member(Id=0, Timestamp=3981-11-15 06:40:05.166, Address=10.111.12.147:0, MachineId=0, Location={CUT}, Role=.NET RTC client), {CUT}): java.io.EOFException

    at com.tangosol.io.nio.ByteBufferReadBuffer$ByteBufferInput.readByte(ByteBufferReadBuffer.java:340)
    at com.tangosol.io.AbstractReadBuffer$AbstractBufferInput.readUnsignedByte(AbstractReadBuffer.java:435)
    at com.tangosol.io.AbstractReadBuffer$AbstractBufferInput.readPackedInt(AbstractReadBuffer.java:560)
    at com.tangosol.io.MultiBufferReadBuffer$MultiBufferInput.readPackedInt(MultiBufferReadBuffer.java:683)
    at com.tangosol.io.pof.PofBufferReader.readAsUniformObject(PofBufferReader.java:3344)
    at com.tangosol.io.pof.PofBufferReader.readMap(PofBufferReader.java:2537)

    Java Pof where it occurs:

    writer.writeMap(40, getMyDict());
    setMyDict((HashMap<String, Double>) reader.readMap(40, new HashMap<String, Double>()));

    public HashMap<String, Double> getMyDict() {
        return myDict;
    }

    public void setMyDict(HashMap<String, Double> myDict) {
        this.myDict = myDict;
    }

    .NET Pof:

    writer.WriteDictionary<String, Double>(40, MyDict);
    MyDict = ((Dictionary<String, Double>) reader.ReadDictionary<String, Double>(40, new Dictionary<String, Double>()));

    public Dictionary<String, Double> MyDict
    {
        get { return myDict; }
        set { myDict = value; }
    }

    Notes: If it helps, 40 is the last POF index on that object. The error appears sometimes, not constantly, depending on the data.

    DB:3.72:Pof Serialization Issue (Hashmap) sz

    Could you please provide us a reproducer so we can look into the problem?

    Thanks,
    Luk
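
    Not part of the original thread, but a minimal local round trip along these lines could serve as a starting point for such a reproducer. It mirrors the snippets above; the class name, the pof-config file name ("my-pof-config.xml") and its type registration are assumptions.

    import java.io.IOException;
    import java.util.HashMap;
    import com.tangosol.io.pof.ConfigurablePofContext;
    import com.tangosol.io.pof.PofReader;
    import com.tangosol.io.pof.PofWriter;
    import com.tangosol.io.pof.PortableObject;
    import com.tangosol.util.Binary;
    import com.tangosol.util.ExternalizableHelper;

    // Hypothetical value class mirroring the Java snippet above; it must be registered
    // as a user type in my-pof-config.xml for the round trip to run.
    public class MyDictHolder implements PortableObject {
        private HashMap<String, Double> myDict = new HashMap<String, Double>();

        public HashMap<String, Double> getMyDict() { return myDict; }
        public void setMyDict(HashMap<String, Double> myDict) { this.myDict = myDict; }

        public void readExternal(PofReader reader) throws IOException {
            setMyDict((HashMap<String, Double>) reader.readMap(40, new HashMap<String, Double>()));
        }

        public void writeExternal(PofWriter writer) throws IOException {
            writer.writeMap(40, getMyDict());
        }

        public static void main(String[] args) {
            MyDictHolder holder = new MyDictHolder();
            holder.getMyDict().put("rate", 1.25);

            // Serialize and deserialize locally, without a cache server.
            ConfigurablePofContext ctx = new ConfigurablePofContext("my-pof-config.xml");
            Binary bin = ExternalizableHelper.toBinary(holder, ctx);
            MyDictHolder copy = (MyDictHolder) ExternalizableHelper.fromBinary(bin, ctx);
            System.out.println(copy.getMyDict());
        }
    }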

  • RELEVANCY SCORE 3.63

    DB:3.63:Coherence Pof Java.Io.Notserializableexception On Standalone Webcenter Spaces Server. dc





    Hi, I'm using Coherence (v3.6.1) with POF serialization in an ADF web application. I have set up a local Coherence server node and a JDeveloper-integrated WebCenter server on my local system (Windows 7 OS), and the application works perfectly when I run it in the local JDeveloper-integrated server. But when I deploy the application to the dev environment (standalone WebCenter Spaces server 11g on Unix OS) we get the exception (Wrapped) java.io.NotSerializableException: com.enbridge.co.ux.coherence.pojos.CommodityPojo. It seems that POF is not configured properly on the standalone WebCenter (Spaces) server; please help find a solution for this issue. Is there any extra configuration we have to do on the standalone WebCenter Spaces server? The error logs and configuration files I have used:

    1. Exception logs:

    14-Jun-2013 3:43:04 o'clock AM MDT Error HTTP BEA-101216 Servlet: "MMFBootStrapServlet" failed to preload on startup in Web application: "MMFUIPortal-Portal-context-root".
    (Wrapped) java.io.NotSerializableException: com.enbridge.co.ux.coherence.pojos.CommodityPojo
    at com.tangosol.util.ExternalizableHelper.toBinary(ExternalizableHelper.java:215)
    at com.tangosol.coherence.component.util.daemon.queueProcessor.service.grid.partitionedService.PartitionedCache$ConverterValueToBinary.convert(PartitionedCache.CDB:3)
    at com.tangosol.util.ConverterCollections$ConverterMap.put(ConverterCollections.java:1578)
    at com.tangosol.coherence.component.util.daemon.queueProcessor.service.grid.partitionedService.PartitionedCache$ViewMap.put(PartitionedCache.CDB:1)
    at com.tangosol.coherence.component.util.SafeNamedCache.put(SafeNamedCache.CDB:1)
    Truncated. see log file for complete stacktrace
    Caused By: java.io.NotSerializableException: com.enbridge.co.ux.coherence.pojos.CommodityPojo
    at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1164)
    at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:330)
    at java.util.ArrayList.writeObject(ArrayList.java:570)
    at sun.reflect.GeneratedMethodAccessor983.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    Truncated. see log file for complete stacktrace

    2. tangosol-coherence-override.xml:

    <?xml version='1.0'?>
    <coherence xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
               xmlns="http://xmlns.oracle.com/coherence/coherence-operational-config"
               xsi:schemaLocation="http://xmlns.oracle.com/coherence/coherence-operational-config coherence-operational-config.xsd">
      <cluster-config>
        <member-identity>
          <cluster-name>mmf_Coh_Cluster</cluster-name>
        </member-identity>
        <multicast-listener>
          <address>224.3.6.0</address>
          <port>60001</port>
          <time-to-live>0</time-to-live>
        </multicast-listener>
        <serializers>
          <serializer>
            <class-name>com.tangosol.io.pof.ConfigurablePofContext</class-name>
            <init-params>
              <init-param>
                <param-type>java.lang.String</param-type>
                <param-value system-property="pof.config">mmfui-pof-config.xml</param-value>
              </init-param>
            </init-params>
          </serializer>
        </serializers>
      </cluster-config>
      <configurable-cache-fact0ry-config>
        <init-params>
          <init-param>
            <param-type>java.lang.String</param-type>
            <param-value system-property="tangosol.coherence.cacheconfig">mmfui-cache-config.xml</param-value>
          </init-param>
        </init-params>
      </configurable-cache-fact0ry-config>
    </coherence>

    3. mmfui-cache-config.xml:

    <?xml version="1.0"?>
    <!DOCTYPE cache-config SYSTEM "cache-config.dtd">
    <cache-config>
      <defaults>
        <serializer>pof</serializer>
      </defaults>
      <caching-scheme-mapping>
        <cache-mapping>
          <cache-name>mmfcache</cache-name>
          <scheme-name>ExamplesPartitionedPofScheme</scheme-name>
        </cache-mapping>
      </caching-scheme-mapping>
      <caching-schemes>
        <distributed-scheme>
          <scheme-name>ExamplesPartitionedPofScheme</scheme-name>
          <service-name>PartitionedPofCache</service-name>
          <serializer>
            <instance>
              <class-name>com.tangosol.io.pof.ConfigurablePofContext</class-name>
              <init-params>
                <init-param>
                  <param-type>String</param-type>
                  <param-value system-property="pof.config">mmfui-pof-config.xml</param-value>
                </init-param>
              </init-params>
            </instance>
          </serializer>
          <backing-map-scheme>
            <local-scheme>
              <!-- each node will be limited to 250MB -->
              <high-units>250M</high-units>
              <unit-calculator>binary</unit-calculator>
            </local-scheme>
          </backing-map-scheme>
          <autostart>true</autostart>
        </distributed-scheme>
      </caching-schemes>
    </cache-config>

    4. mmfui-pof-config.xml:

    <?xml version="1.0"?>
    <!DOCTYPE pof-config SYSTEM "pof-config.dtd">
    <pof-config>
      <user-type-list>
        <!-- coherence POF user types -->
        <include>coherence-pof-config.xml</include>
        <!-- com.tangosol.examples package -->
        <user-type>
          <type-id>1001</type-id>
          <class-name>com.enbridge.co.ux.coherence.pojos.CommodityPojo</class-name>
        </user-type>
      </user-type-list>
    </pof-config>

    5. ~-cache-server.sh file on the Coherence distribution:

    COHERENCE_HOME=/u01/app/oracle/product/fmw11g/coherence_3.6
    JAVA_HOME=/usr/java/jdk1.6.0_24
    PATH=$PATH:$JAVA_HOME/bin
    CONFIG_HOME=/u01/app/oracle/product/fmw11g/coherence_3.6/mmfconfig

    # specify the JVM heap size
    MEMORY=512m

    if [ ! -f ${COHERENCE_HOME}/bin/cache-server.sh ]; then
      echo "coherence.sh: must be run from the Coherence installation directory."
      exit
    fi

    if [ -f $JAVA_HOME/bin/java ]; then
      JAVAEXEC=$JAVA_HOME/bin/java
    else
      JAVAEXEC=java
    fi

    COH_OPTS="$COH_OPTS -Dtangosol.coherence.distributed.localstorage=true -Dtangosol.coherence.cluster=mmf_Coh_Cluster -Dtangosol.coherence.clusterport=60001 -Dtangosol.coherence.clusteraddress=224.3.6.0 -Dtangosol.coherence.cacheconfig=~/mmfui-cache-config.xml"
    JAVA_OPTS="-Xms$MEMORY -Xmx$MEMORY -Dtangosol.pof.enabled=true -Dtangosol.pof.config=~/mmfui-pof-config.xml"

    $JAVAEXEC $COH_OPTS -server -showversion $JAVA_OPTS -cp "$COHERENCE_VAR:$CONFIG_HOME:$COHERENCE_HOME/lib/coherence.jar:$COHERENCE_HOME/lib/coherence-common-1.5.0.jar:pojoClasses.jar" com.tangosol.net.DefaultCacheServer $1

    6. COHERENCE_PROPERTIES variable added in setDomainEnv.sh file:

    COHERENCE_PROPERTIES="-Dtangosol.coherence.distributed.localstorage=false -Dtangosol.coherence.clusteraddress=224.3.6.0 -Dtangosol.coherence.clusterport=60001 -Dtangosol.coherence.cluster=mmf_Coh_Cluster -Dtangosol.coherence.override=~/tangosol-coherence-override.xml -Dtangosol.coherence.cacheconfig=~/mmfui-cache-config.xml -Dpof.config=~/mmfui-pof-config.xml"
    JAVA_PROPERTIES="-Dplatform.home=${WL_HOME} -Dwls.home=${WLS_HOME} -Dweblogic.home=${WLS_HOME} ${COHERENCE_PROPERTIES}"

    DB:3.63:Coherence Pof Java.Io.Notserializableexception On Standalone Webcenter Spaces Server. dc

    Does the class com.enbridge.co.ux.coherence.pojos.CommodityPojo implement java.io.Serializable? If not, implement Serializable.
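
    Not part of the original reply, but a minimal sketch of what the POJO could look like. The field is hypothetical; implementing PortableObject matches the type-id 1001 registration in mmfui-pof-config.xml above, and Serializable covers services that fall back to standard Java serialization.

    package com.enbridge.co.ux.coherence.pojos;

    import java.io.IOException;
    import java.io.Serializable;
    import com.tangosol.io.pof.PofReader;
    import com.tangosol.io.pof.PofWriter;
    import com.tangosol.io.pof.PortableObject;

    public class CommodityPojo implements PortableObject, Serializable {
        private String commodityName;   // hypothetical field for illustration

        public CommodityPojo() {
            // public no-arg constructor required for POF deserialization
        }

        public String getCommodityName() { return commodityName; }
        public void setCommodityName(String name) { this.commodityName = name; }

        public void readExternal(PofReader in) throws IOException {
            commodityName = in.readString(0);
        }

        public void writeExternal(PofWriter out) throws IOException {
            out.writeString(0, commodityName);
        }
    }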

  • RELEVANCY SCORE 3.58

    DB:3.58:Pof Serialization With Replicated Cache? m8


    Sorry again for the newbie question.

    Can you use POF serialized objects in a replicated cache?

    All of the examples show POF serialized objects being used with a partitioned cache.

    If you can do this, are there any caveats involved with the "replication" cache? I assume it would have to be started using the same configuration as the "master" cache.

    DB:3.58:Pof Serialization With Replicated Cache? m8

    Thanks Rob.

    So, you just start up the Coherence instance on the (or at the) replication site, using the same configuration as the "master"? (Of course with appropriate classpaths and such set correctly)

  • RELEVANCY SCORE 3.44

    DB:3.44:Deadlock While Creating A Topic In Messaging Pattern? md


    Hi,
    I've faced some kind of deadlock while trying to use messaging pattern

    I use coherence-messagingpattern-pof-cache-config.xml (I tried without POF serializers - no difference)

    in client side I use following code to establish a topic:

    messagingSession = DefaultMessagingSession.getInstance();
    topic = messagingSession.createTopic(topicName);

    When I have a single server in the cluster it works fine, but when I add one more cache server, this code freezes forever on creating the topic (line 2). I run three JVMs on the same host; the versions used are Coherence 3.4.2, messaging/command pattern 2.4.0, incubator common 1.4.0.

    Following log appears on one of the cluster nodes

    ...
    2009-10-21 14:06:55.841/54.894 Oracle Coherence GE 3.4.2/411 D5 (thread=Cluster, member=2): Member 3 joined Service DistributedCacheForCommandPatternDistributedCommands with senior member 1
    2009-10-21 14:06:56.031/55.084 Oracle Coherence GE 3.4.2/411 Info (thread=DistributedCache:DistributedCacheForCommandPatternDistributedCommands, member=2): Loading POF configuration from resource "jar:file:/home/breeze/coherence/lib/ext/com.macys.breeze.coherence.config.jar!/messaging-pof-config.xml"
    2009-10-21 14:06:56.032/55.085 Oracle Coherence GE 3.4.2/411 Info (thread=DistributedCache:DistributedCacheForCommandPatternDistributedCommands, member=2): Loading POF configuration from resource "jar:file:/home/breeze/coherence/lib/coherence.jar!/coherence-pof-config.xml"
    2009-10-21 14:06:56.036/55.089 Oracle Coherence GE 3.4.2/411 Info (thread=DistributedCache:DistributedCacheForCommandPatternDistributedCommands, member=2): Loading POF configuration from resource "jar:file:/home/breeze/coherence/lib/ext/com.macys.breeze.coherence.config.jar!/coherence-common-pof-config.xml"
    2009-10-21 14:06:56.037/55.090 Oracle Coherence GE 3.4.2/411 Info (thread=DistributedCache:DistributedCacheForCommandPatternDistributedCommands, member=2): Loading POF configuration from resource "jar:file:/home/breeze/coherence/lib/ext/com.macys.breeze.coherence.config.jar!/coherence-commandpattern-pof-config.xml"
    2009-10-21 14:06:56.038/55.091 Oracle Coherence GE 3.4.2/411 Info (thread=DistributedCache:DistributedCacheForCommandPatternDistributedCommands, member=2): Loading POF configuration from resource "jar:file:/home/breeze/coherence/lib/ext/com.macys.breeze.coherence.config.jar!/coherence-messagingpattern-pof-config.xml"
    2009-10-21 14:06:56.067/55.120 Oracle Coherence GE 3.4.2/411 D5 (thread=Cluster, member=2): Member 3 joined Service DistributedCacheForCommandPattern with senior member 1
    2009-10-21 14:06:56.121/55.174 Oracle Coherence GE 3.4.2/411 D5 (thread=DistributedCacheForCommandPatternWorker:1, member=2): Context Identifier{product-updates-topic} has been inserted into this member
    2009-10-21 14:06:56.126/55.179 Oracle Coherence GE 3.4.2/411 D5 (thread=DistributedCacheForCommandPatternWorker:1, member=2): Creating CommandExecutor for Identifier{product-updates-topic}

    On the client node:

    2009-10-21 14:06:55.879/20.394 Oracle Coherence GE 3.4.2/411 D5 (thread=DistributedCache:DistributedCacheForCommandPatternDistributedCommands, member=3): Service DistributedCacheForCommandPatternDistributedCommands joined the cluster with senior service member 1
    2009-10-21 14:06:55.888/20.403 Oracle Coherence GE 3.4.2/411 D5 (thread=DistributedCache:DistributedCacheForCommandPatternDistributedCommands, member=3): Service DistributedCacheForCommandPatternDistributedCommands: received ServiceConfigSync containing 258 entries
    2009-10-21 14:06:56.070/20.585 Oracle Coherence GE 3.4.2/411 D5 (thread=DistributedCache:DistributedCacheForCommandPattern, member=3): Service DistributedCacheForCommandPattern joined the cluster with senior service member 1
    2009-10-21 14:06:56.075/20.590 Oracle Coherence GE 3.4.2/411 D5 (thread=DistributedCache:DistributedCacheForCommandPattern, member=3): Service DistributedCacheForCommandPattern: received ServiceConfigSync containing 258 entries

    With step-debugging it works fine, so it seems like a kind of deadlock.

    And I'm just stuck, with no idea where it could come from.

    ?:|

    DB:3.44:Deadlock While Creating A Topic In Messaging Pattern? md

    Hi,

    functional updates passed

    With a thread dump and extended logs I found the cause of the issue - it was actually the ExtendTcpService, which could not bind its port (I ran two cache servers on a single machine, so they were fighting over the same address) and blocked Coherence service creation. I fixed the tcp-acceptor configuration and the MessagingService works now.

    Thanks!

  • RELEVANCY SCORE 3.36

    DB:3.36:.Net Coherence 3.7.0.0 And .Net Coherence 3.7.1.5 Difference 3s


    Hello,

    In the .NET application I am using the following code for POF serialization:

    public static class SerializationPofHelper
    {
        private static readonly ConfigurablePofContext ConfigurablePofContext;
        public const string CoherencePofConfigFileName = "coherence-pof-config.xml";

        static SerializationPofHelper()
        {
            ConfigurablePofContext = new ConfigurablePofContext(CoherencePofConfigFileName);
        }

        public static byte[] Serialize<T>(T obj)
        {
            return SerializationHelper.ToBinary(obj, ConfigurablePofContext).ToByteArray();
        }
    }
    I have upgraded .NET Coherence from version 3.7.0.0 to version 3.7.1.5, but the serialized object is different:

    .NET Coherence 3.7.0.0: 0x 911F00014E0132024E0132034E013140
    .NET Coherence 3.7.1.5: 0x15 911F00014E0132024E0132034E013140

    The difference is in the first byte.

    Java ExternalizableHelper.toBinary serializes without the 0x15 first byte using Coherence 3.7.1.5:
    Java Coherence 3.7.0.0: 0x911F00014E0132024E0132034E013140

    Does anyone have an idea why the result is different between .NET Coherence 3.7.0.0 and .NET Coherence 3.7.1.5?

    Thank you,
    Yuriy Lazarenko

    DB:3.36:.Net Coherence 3.7.0.0 And .Net Coherence 3.7.1.5 Difference 3s

    This is interesting.

    1) Is this causing a problem, i.e. has something broken as a result?

    2) Does the object deserialize correctly, i.e. you've done toBinary() but what if you attempt to go in the other direction?

    Peace,

    Cameron Purdy | Oracle
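
    Not from the original reply, but on the Java side a quick check in both directions could look like the sketch below (the POF config name and test value are placeholders); inspecting the first byte of the Binary and confirming fromBinary() still works would answer question 2.

    import com.tangosol.io.pof.ConfigurablePofContext;
    import com.tangosol.util.Binary;
    import com.tangosol.util.ExternalizableHelper;

    // Round-trip sketch: serialize, inspect the leading byte, then deserialize again.
    public class RoundTripCheck {
        public static void main(String[] args) {
            ConfigurablePofContext ctx = new ConfigurablePofContext("coherence-pof-config.xml");
            Object original = "some test value";
            Binary bin = ExternalizableHelper.toBinary(original, ctx);
            System.out.printf("first byte = 0x%02x%n", bin.toByteArray()[0] & 0xFF);
            Object copy = ExternalizableHelper.fromBinary(bin, ctx);
            System.out.println(original.equals(copy));
        }
    }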

  • RELEVANCY SCORE 3.29

    DB:3.29:Unknown User Type With Pof Serialization m9


    Hi all,

    I'm using 3.6 and am just starting to implement POF. In general it has been pretty easy, but I seem to have a problem with my near scheme and POF. Things work OK in my unit tests, but it doesn't work when I deploy to a single instance of WebLogic 12 on my laptop. Here is an example scheme:
    <near-scheme>
      <scheme-name>prod-near</scheme-name>
      <autostart>true</autostart>
      <front-scheme>
        <local-scheme>
          <high-units>{high-units 2000}</high-units>
          <expiry-delay>{expiry-delay 2h}</expiry-delay>
        </local-scheme>
      </front-scheme>
      <back-scheme>
        <distributed-scheme>
          <backing-map-scheme>
            <local-scheme>
              <high-units>{high-units 10000}</high-units>
              <expiry-delay>{expiry-delay 2h}</expiry-delay>
            </local-scheme>
          </backing-map-scheme>
          <serializer>
            <instance>
              <class-name>com.tangosol.io.pof.ConfigurablePofContext</class-name>
              <init-params>
                <init-param>
                  <param-value>/Bus/pof-config.xml</param-value>
                  <param-value>String</param-value>
                </init-param>
              </init-params>
            </instance>
          </serializer>
        </distributed-scheme>
      </back-scheme>
    </near-scheme>

    I don't know if it matters, but some of my caches use another scheme that references this one as a parent:
    <near-scheme>
      <scheme-name>daily-near</scheme-name>
      <scheme-ref>prod-near</scheme-ref>
      <autostart>true</autostart>
      <back-scheme>
        <distributed-scheme>
          <backing-map-scheme>
            <local-scheme>
              <high-units system-property="daily-near-high-units">{high-units 10000}</high-units>
              <expiry-delay>{expiry-delay 1d}</expiry-delay>
            </local-scheme>
          </backing-map-scheme>
          <serializer>
            <instance>
              <class-name>com.tangosol.io.pof.ConfigurablePofContext</class-name>
              <init-params>
                <init-param>
                  <param-value>/Bus/pof-config.xml</param-value>
                  <param-value>String</param-value>
                </init-param>
              </init-params>
            </instance>
          </serializer>
        </distributed-scheme>
      </back-scheme>
    </near-scheme>

    Those schemes have existed for years. I'm only now adding the serializers. I use this same cache config file in my unit tests, as well as the same pof config file. My unit tests do ExternalizableHelper.toBinary(o, pofContext) and ExternalizableHelper.fromBinary(b, pofContext). I create the test pof context by doing new ConfigurablePofContext("/Bus/pof-config.xml"). I've also tried actually putting and getting an object to and from a cache in my unit tests. Everything works as expected.

    My type definition looks like this:

    <user-type>
      <type-id>1016</type-id>
      <class-name>com.mycompany.mydepartment.bus.service.role.RoleResource</class-name>
    </user-type>

    I'm not using the tangosol.pof.enabled system property because I don't think it's necessary with the explicit serializers.

    Here is part of a stack trace:

    (Wrapped) java.io.IOException: unknown user type: com.mycompany.mydepartment.bus.service.role.RoleResource
    at com.tangosol.util.ExternalizableHelper.toBinary(ExternalizableHelper.java:214)
    at com.tangosol.coherence.component.util.daemon.queueProcessor.service.grid.partitionedService.PartitionedCache$ConverterValueToBinary.convert(PartitionedCache.CDB:3)
    at com.tangosol.util.ConverterCollections$ConverterCacheMap.put(ConverterCollections.java:2486)
    at com.tangosol.coherence.component.util.daemon.queueProcessor.service.grid.partitionedService.PartitionedCache$ViewMap.put(PartitionedCache.CDB:1)
    at com.tangosol.coherence.component.util.SafeNamedCache.put(SafeNamedCache.CDB:1)
    at com.tangosol.net.cache.CachingMap.put(CachingMap.java:943)
    at com.tangosol.net.cache.CachingMap.put(CachingMap.java:902)
    at com.tangosol.net.cache.CachingMap.put(CachingMap.java:814)

    Any idea what I'm missing?

    Thanks
    John
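
    Not from the original thread, but a quick way to double-check that same POF context outside the cluster is to load it directly and ask it for the type id of the failing class; this sketch uses the path and class name from the post and throws "unknown user type" if the registration is not visible on the classpath in use.

    import com.tangosol.io.pof.ConfigurablePofContext;

    public class PofConfigCheck {
        public static void main(String[] args) {
            ConfigurablePofContext ctx = new ConfigurablePofContext("/Bus/pof-config.xml");
            // Throws IllegalArgumentException ("unknown user type") if the class is not registered.
            int typeId = ctx.getUserTypeIdentifier(
                    com.mycompany.mydepartment.bus.service.role.RoleResource.class);
            System.out.println("type id = " + typeId);
        }
    }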

    DB:3.29:Unknown User Type With Pof Serialization m9


  • RELEVANCY SCORE 3.27

    DB:3.27:Binary Decoration In Extend Client pj


    Is it possible to somehow intercept a simple put() operation at the client and to decorate the Binary objects produced by POF serialization? The goal here is to pass metadata that is not a part of the cached state and yet is available to be processed by e.g. MapTrigger or BinaryEntryStore implementations.

    DB:3.27:Binary Decoration In Extend Client pj

    spark wrote:
    Is it possible to somehow intercept a simple put() operation at the client and to decorate the Binary objects produced by POF serialization? The goal here is to pass metadata that is not a part of the cached state and yet is available to be processed by e.g. MapTrigger or BinaryEntryStore implementations.

    Hi spark,

    I don't remember off the top of my head the format of the decorated binary, or whether it is possible to write a decoration without knowing the size of the binary you want to decorate. If that is possible, then it would be possible with a custom Serializer (not PofSerializer) writing the decoration header before delegating to the original Serializer.

    If you use POF and 3.6+ and you want to "decorate" only the value, you should probably write decoration data into the normal PofWriter and create a nested POF writer to write the to-be-decorated object afterwards.

    Best regards,

    Robert
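
    Not from the original reply, but a rough sketch of the "custom Serializer" idea Robert mentions: prepend a fixed-length application header before delegating to the real POF serializer, and consume it again on the way back. This is not the Coherence binary-decoration format; the header value and pof-config file name are placeholders.

    import java.io.IOException;
    import com.tangosol.io.ReadBuffer;
    import com.tangosol.io.Serializer;
    import com.tangosol.io.WriteBuffer;
    import com.tangosol.io.pof.ConfigurablePofContext;

    public class HeaderPrependingSerializer implements Serializer {
        // Delegate doing the actual POF work (config file name assumed).
        private final Serializer delegate = new ConfigurablePofContext("my-pof-config.xml");

        public void serialize(WriteBuffer.BufferOutput out, Object o) throws IOException {
            out.writeInt(42);               // hypothetical fixed-length metadata header
            delegate.serialize(out, o);     // then the normal POF stream
        }

        public Object deserialize(ReadBuffer.BufferInput in) throws IOException {
            int header = in.readInt();      // consume the header written above (ignored here)
            return delegate.deserialize(in);
        }
    }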

  • RELEVANCY SCORE 3.26

    DB:3.26:Why We Need Java Class In C++ Pof Serialization f9


    Hi,
    I'm really confused about why we need a Java class which implements PortableObject to support complex C++ objects. If we are not using any queries or entry processors in the application, can't we keep the object in its serialized byte format and retrieve it via the C++ deserialization?

    Please share your thoughts if there's a way if we can skip any Java implementation.

    regards,
    Sura

    DB:3.26:Why We Need Java Class In C++ Pof Serialization f9


  • RELEVANCY SCORE 3.21

    DB:3.21:Pof Serialization Err Unknown User Type:Com.Tangosol.Run.Xml.Simpleelement 38


    I am trying to use POF serialization. I am running with the following configuration...

    I have set the following properties for coherence server node when it starts up:

    tangosol.coherence.distributed.localstorage := true
    tangosol.pof.config := carbon-pof-config.xml
    tangosol.pof.enabled := true
    tangosol.coherence.member:=AT01073-4488-1238199841069

    My dataobjects extend AbstractEvolvable and implement the PortableObject interface.....

    It can be seen from the log that coherence picks up the pof configuration from the correct pof-config file...
    Loading POF configuration from resource "jar:file:/D:/Source/carbon-data/target/carbon-data-1.00.0.000-SNAPSHOT.jar!/carbon-pof-config.xml"

    my pof config file is:

    ?xml version="1.0" encoding="UTF-8"?
    !DOCTYPE pof-config SYSTEM "pof-config.dtd"
    pof-config
    user-type-list
    user-type
    type-id1/type-id
    class-nameDReferenceDataObjectImpl/class-name
    /user-type
    user-type
    type-id2/type-id
    class-nameDAttributeOptionImpl/class-name
    /user-type
    user-type
    type-id3/type-id
    class-nameDProgramAttributeImpl/class-name
    /user-type
    user-type
    type-id4/type-id
    class-nameDProgramImpl/class-name
    /user-type
    user-type
    type-id5/type-id
    class-nameDAttributeOptionJoinImpl/class-name
    /user-type
    /user-type-list
    allow-subclassestrue/allow-subclasses
    /pof-config

    relevant section of my cache configuration is :

    <distributed-scheme>
      <scheme-name>default-distributed</scheme-name>
      <service-name>DistributedCache</service-name>
      <backing-map-scheme>
        <class-scheme>
          <scheme-ref>default-backing-map</scheme-ref>
        </class-scheme>
      </backing-map-scheme>
      <serializer>
        <class-name>com.tangosol.io.pof.ConfigurablePofContext</class-name>
      </serializer>
    </distributed-scheme>

    On the WebLogic server instance where the data is being put into the cache, no errors are reported (it shares an identical configuration except that local storage is disabled for the distributed cache).
    However the following exception is thrown on the server side.

    java.lang.IllegalArgumentException: unknown user type: com.tangosol.run.xml.SimpleElement
    at com.tangosol.io.pof.ConfigurablePofContext.getUserTypeIdentifier(ConfigurablePofContext.java:400)
    at com.tangosol.io.pof.ConfigurablePofContext.getUserTypeIdentifier(ConfigurablePofContext.java:389)
    at com.tangosol.io.pof.PofBufferWriter.writeObject(PofBufferWriter.java:1432)
    at com.tangosol.io.pof.ConfigurablePofContext.serialize(ConfigurablePofContext.java:338)
    at com.tangosol.coherence.component.util.daemon.queueProcessor.Service.writeObject(Service.CDB:4)
    at com.tangosol.coherence.component.net.Message.writeObject(Message.CDB:1)
    at com.tangosol.coherence.component.net.message.DistributedCacheResponse.write(DistributedCacheResponse.CDB:2)
    at com.tangosol.coherence.component.util.daemon.queueProcessor.packetProcessor.PacketPublisher.packetizeMessage(PacketPublisher.CDB:137)
    at com.tangosol.coherence.component.util.daemon.queueProcessor.packetProcessor.PacketPublisher$InQueue.add(PacketPublisher.CDB:8)
    at com.tangosol.coherence.component.util.daemon.queueProcessor.service.Grid.dispatchMessage(Grid.CDB:50)
    at com.tangosol.coherence.component.util.daemon.queueProcessor.service.Grid.post(Grid.CDB:53)
    at com.tangosol.coherence.component.util.daemon.queueProcessor.service.grid.DistributedCache$StorageIdRequest.onReceived(DistributedCache.CDB:47)
    at com.tangosol.coherence.component.util.daemon.queueProcessor.service.Grid.onMessage(Grid.CDB:9)
    at com.tangosol.coherence.component.util.daemon.queueProcessor.service.Grid.onNotify(Grid.CDB:130)
    at com.tangosol.coherence.component.util.daemon.queueProcessor.service.grid.DistributedCache.onNotify(DistributedCache.CDB:3)
    at com.tangosol.coherence.component.util.Daemon.run(Daemon.CDB:37)
    at java.lang.Thread.run(Thread.java:619)

    Any help would be greatly appreciated.

    Thanks.
    Shamsur

    DB:3.21:Pof Serialization Err Unknown User Type:Com.Tangosol.Run.Xml.Simpleelement 38

    SR-APX wrote:
    Aleks

    thanks for your response.

    However, the include element needs to be present inside the user-type-list tag:

    <user-type-list>
      <include>coherence-pof-config.xml</include>
    </user-type-list>

    For all other interested users: the property tangosol.pof.enabled=true must also be set for POF serialization to work correctly.

    Thanks again...

    Shamsur

    Hi Shamsur,

    it is not mandatory to use tangosol.pof.enabled=true, you can alternatively specify the serializer for the clustered services to be configured for POF (not necessarily all of them) on a service-by-service basis in the cache configuration file explicitly with the following element.

    <serializer>com.tangosol.io.pof.ConfigurablePofContext</serializer>

    Best regards,

    Robert

  • RELEVANCY SCORE 3.15

    DB:3.15:Pof Serialization For Exceptions Thrown 99


    If any Java exceptions are thrown, they need to be shown to clients in some proper way. How can these Java exceptions be made POF-enabled? Is com.tangosol.io.pof.ThrowablePofSerializer meant for this?

    I can have my custom exception class implement the Portable interface, but I wanted to know if there is some way to POF-enable existing Java exceptions, e.g. java.util.ConcurrentModificationException.

    Edited by: Khangharoth on Jul 19, 2009 10:40 PM

    DB:3.15:Pof Serialization For Exceptions Thrown 99

    Is com.tangosol.io.pof.ThrowablePofSerializer meant for this? Yes, and with it we can have any Java throwable POF-enabled.
    But there is a catch: "Any deserialized exception will lose type information, and simply be represented as a PortableException".

  • RELEVANCY SCORE 3.13

    DB:3.13:Repositoryexception During Serialization s8



    Hi,

    we are using CQ5.4 and in one of our environments I see that both replications queues are active but have over 2k of pending items (for activation and deactivation)

    One the replication logs shows a lot of the following error:

    ERROR - agent2 : Error while building replication content com.day.cq.replication.ReplicationException: RepositoryException during serialization

    The other queue log does not show any errors.

    Any idea what could have caused this,

    Thanks,

    Lior

    DB:3.13:Repositoryexception During Serialization s8


    Thanks for the feedback. We did a CQ restart and the queues started processing.

    Lior

  • RELEVANCY SCORE 3.12

    DB:3.12:Re: Implement Pofserializer, Serialize/Deserialize Not Invoked da


    Hi,

    just had a quick look, but maybe your POF config is wrong. The name of the class you want to serialize in POF format is 'Item', while the name in your POF config is 'CardCacheItem'.
    Make sure that the full class name in the POF config is correct (i.e. your Item class belongs to the package com.betfair.site.coherence.entities).

    Also, make sure your POF config is being used by your client as well as your server.

    Please, shout if this did not help.

    Thanks

    DB:3.12:Re: Implement Pofserializer, Serialize/Deserialize Not Invoked da

    user11200171 wrote:
    I forgot to mention that I use a replicated-scheme and test on my local machine. Could this be a problem?

    Probably. Are you using only a single cluster node? In that case replicated caches do not serialize the cached values. Try starting another cluster node and you should see a difference.

    Best regards,

    Robert

  • RELEVANCY SCORE 3.06

    DB:3.06:Testing .Net Pof Serialization Without Starting Cache Server fa


    We want to test POF serialisation from .NET without needing to start a cache server. We have many POF classes and wish to run unit tests of POF fidelity.

    We can do this on the Java side as follows:

    MyPOFClass example = new MyPOFClass();

    ConfigurablePofContext pofContext = new ConfigurablePofContext("my-pof-config.xml");
    Binary binary = ExternalizableHelper.toBinary(example, pofContext);
    ...
    ExternalizableHelper.fromBinary(binary, pofContext);

    How can this be done using the .NET API?

    - No ExternalizableHelper exists for .NET (reasonable since .NET has no ExternalizableLite support)
    - POFHelper exists but lacks to/from Binary methods.
    - Calling the read/writeExternal() methods directly requires providing objects implementing IPofReader and IPofWriter. It is unclear how to do this.

    Thanks,
    phil wheeler

  • RELEVANCY SCORE 3.04

    DB:3.04:Oracle Coherence Issue da


    Hi, I am getting the below error while trying to run the Coherence cache-server.cmd. Please help me get this issue resolved. I have the config files in the correct location.

    Exception in thread "main" (Wrapped: Failed to load the factory) (Wrapped: Missi
    ng or inaccessible constructor "com.tangosol.net.DefaultConfigurableCacheFactory
    (String)"
    configurable-cache-factory-config
    class-namecom.tangosol.net.DefaultConfigurableCacheFactory/class-name
    init-params
    init-param
    param-typejava.lang.String/param-type
    param-valueC:\Coherence\employee-cache-config.xml;C:\Coherence\employee-
    pof-config.xml;/param-value
    /init-param
    /init-params
    /configurable-cache-factory-config) java.lang.reflect.InvocationTargetExceptio
    n
    at com.tangosol.util.Base.ensureRuntimeException(Base.java:288)
    at com.tangosol.net.ScopedCacheFactoryBuilder.getDefaultFactory(ScopedCa
    cheFactoryBuilder.java:311)
    at com.tangosol.net.DefaultCacheFactoryBuilder.getSingletonFactory(Defau
    ltCacheFactoryBuilder.java:48)
    at com.tangosol.net.DefaultCacheFactoryBuilder.getFactory(DefaultCacheFa
    ctoryBuilder.java:121)
    at com.tangosol.net.ScopedCacheFactoryBuilder.getConfigurableCacheFactor
    y(ScopedCacheFactoryBuilder.java:112)
    at com.tangosol.net.CacheFactory.getConfigurableCacheFactory(CacheFactor
    y.java:126)
    at com.tangosol.net.DefaultCacheServer.getDefaultConfigurableCacheFactor
    y(DefaultCacheServer.java:364)
    at com.tangosol.net.DefaultCacheServer.main(DefaultCacheServer.java:197)

    Caused by: (Wrapped: Missing or inaccessible constructor "com.tangosol.net.Defau
    ltConfigurableCacheFactory(String)"
    configurable-cache-factory-config
    class-namecom.tangosol.net.DefaultConfigurableCacheFactory/class-name
    init-params
    init-param
    param-typejava.lang.String/param-type
    param-valueC:\Coherence\employee-cache-config.xml;C:\Coherence\employee-
    pof-config.xml;/param-value
    /init-param
    /init-params
    /configurable-cache-factory-config) java.lang.reflect.InvocationTargetExceptio
    n
    at com.tangosol.util.Base.ensureRuntimeException(Base.java:288)
    at com.tangosol.run.xml.XmlHelper.createInstance(XmlHelper.java:2652)
    at com.tangosol.run.xml.XmlHelper.createInstance(XmlHelper.java:2536)
    at com.tangosol.net.ScopedCacheFactoryBuilder.getDefaultFactory(ScopedCa
    cheFactoryBuilder.java:273)
    ... 6 more
    Caused by: java.lang.reflect.InvocationTargetException
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)

    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstruct
    orAccessorImpl.java:39)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingC
    onstructorAccessorImpl.java:27)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
    at com.tangosol.util.ClassHelper.newInstance(ClassHelper.java:694)
    at com.tangosol.run.xml.XmlHelper.createInstance(XmlHelper.java:2611)
    ... 8 more
    Caused by: (Wrapped: Failed to load cache configuration: C:\Coherence\employee-c
    ache-config.xml;C:\Coherence\employee-pof-config.xml;) java.io.IOException: The
    cache configuration is missing: "C:\Coherence\employee-cache-config.xml;C:\Coher
    ence\employee-pof-config.xml;", loader=null
    at com.tangosol.util.Base.ensureRuntimeException(Base.java:288)
    at com.tangosol.run.xml.XmlHelper.loadResourceInternal(XmlHelper.java:34
    1)
    at com.tangosol.run.xml.XmlHelper.loadFileOrResource(XmlHelper.java:283)

    at com.tangosol.net.DefaultConfigurableCacheFactory.loadConfig(DefaultCo
    nfigurableCacheFactory.java:439)
    at com.tangosol.net.DefaultConfigurableCacheFactory.loadConfig(DefaultCo
    nfigurableCacheFactory.java:425)
    at com.tangosol.net.DefaultConfigurableCacheFactory.init(DefaultConfig
    urableCacheFactory.java:155)
    ... 14 more
    Caused by: java.io.IOException: The cache configuration is missing: "C:\Coherenc
    e\employee-cache-config.xml;C:\Coherence\employee-pof-config.xml;", loader=null
    at com.tangosol.run.xml.XmlHelper.loadResourceInternal(XmlHelper.java:31
    8)
    ... 18 more
    2012-12-26 18:08:25.628/0.468 Oracle Coherence 3.7.1.0 Info (thread=main, memb
    er=n/a): Loaded operational configuration from "jar:file:/C:/Coherence/lib/coher
    ence.jar!/tangosol-coherence.xml"
    2012-12-26 18:08:25.878/0.718 Oracle Coherence 3.7.1.0 Info (thread=main, memb
    er=n/a): Loaded operational overrides from "jar:file:/C:/Coherence/lib/coherence
    .jar!/tangosol-coherence-override-dev.xml"
    2012-12-26 18:08:25.878/0.718 Oracle Coherence 3.7.1.0 D5 (thread=main, member
    =n/a): Optional configuration override "/tangosol-coherence-override.xml" is not
    specified
    2012-12-26 18:08:25.878/0.718 Oracle Coherence 3.7.1.0 D5 (thread=main, member
    =n/a): Optional configuration override "/custom-mbeans.xml" is not specified

    Oracle Coherence Version 3.7.1.0 Build 27797
    Grid Edition: Development mode
    Copyright (c) 2000, 2011, Oracle and/or its affiliates. All rights reserved.

    Regards,
    CG

    DB:3.04:Oracle Coherence Issue da

    Hi,

    What you are doing wrong is marked in bold:

    <configurable-cache-factory-config>
      <class-name>com.tangosol.net.DefaultConfigurableCacheFactory</class-name>
      <init-params>
        <init-param>
          <param-type>java.lang.String</param-type>
          *<param-value>C:\Coherence\employee-cache-config.xml;C:\Coherence\employee-pof-config.xml;</param-value>*
        </init-param>
      </init-params>
    </configurable-cache-factory-config>

    The configurable-cache-factory element is used to load the coherence-cache-config.xml only, not your POF configuration. For your POF configuration, you need to define a serializer element for each cache (or for all caches) within coherence-cache-config.xml, as below:

    <serializer>
      <class-name>com.tangosol.io.pof.ConfigurablePofContext</class-name>
      <init-params>
        <init-param>
          <param-value>C:\Coherence\employee-pof-config.xml</param-value>
          <param-type>String</param-type>
        </init-param>
      </init-params>
    </serializer>

    HTH

    Cheers,
    _NJ

  • RELEVANCY SCORE 3.01

    DB:3.01:Java.Lang.Classcastexception: Com.Tangosol.Io.Pof.Portableexception 1m


    Hi,

    I'm using Coherence 3.4.2 and Coherence for .NET. I've managed to get my .NET client talking to the Coherence cluster. However, now I want to get a Java client to do the same. (I want to connect to an Extend node.)

    The Java client throws this exception

    ************
    2009-04-23 13:57:01.534/6.500 Oracle Coherence 3.4.2/411 Info (thread=main, member=n/a): Loaded operational configuration from resource "jar:file:/C:/opt/Oracle/coherence/lib/coherence.jar!/tangosol-coherence.xml"
    2009-04-23 13:57:01.550/6.516 Oracle Coherence 3.4.2/411 Info (thread=main, member=n/a): Loaded operational overrides from resource "jar:file:/C:/opt/Oracle/coherence/lib/coherence.jar!/tangosol-coherence-override-dev.xml"
    2009-04-23 13:57:01.550/6.516 Oracle Coherence 3.4.2/411 D5 (thread=main, member=n/a): Optional configuration override "/tangosol-coherence-override.xml" is not specified
    2009-04-23 13:57:01.565/6.531 Oracle Coherence 3.4.2/411 D5 (thread=main, member=n/a): Optional configuration override "/custom-mbeans.xml" is not specified

    Oracle Coherence Version 3.4.2/411
    Grid Edition: Development mode
    Copyright (c) 2000-2009 Oracle. All rights reserved.

    2009-04-23 13:57:02.128/7.094 Oracle Coherence GE 3.4.2/411 Info (thread=main, member=n/a): Loaded cache configuration from file "H:\pradhan\Java\Coherence\config\cache-extend-config.xml"
    2009-04-23 13:57:02.597/7.563 Oracle Coherence GE 3.4.2/411 D5 (thread=ExtendTcpCacheService:TcpInitiator, member=n/a): Started: TcpInitiator{Name=ExtendTcpCacheService:TcpInitiator, State=(SERVICE_STARTED), ThreadCount=0, Codec=Codec(Format=POF), PingInterval=0, PingTimeout=5000, RequestTimeout=5000, ConnectTimeout=5000, RemoteAddresses=[spmbs008/172.21.194.185:9099,spmbs006/172.21.194.186:9099], KeepAliveEnabled=true, TcpDelayEnabled=false, ReceiveBufferSize=0, SendBufferSize=0, LingerTimeout=-1}
    2009-04-23 13:57:02.597/7.563 Oracle Coherence GE 3.4.2/411 D5 (thread=main, member=n/a): Opening Socket connection to 172.21.194.185:9099
    2009-04-23 13:57:02.612/7.578 Oracle Coherence GE 3.4.2/411 Info (thread=main, member=n/a): Connected to 172.21.194.185:9099
    2009-04-23 13:57:02.659/7.625 Oracle Coherence GE 3.4.2/411 D5 (thread=ExtendTcpCacheService:TcpInitiator, member=n/a): Stopped: TcpInitiator{Name=ExtendTcpCacheService:TcpInitiator, State=(SERVICE_STOPPED), ThreadCount=0, Codec=Codec(Format=POF), PingInterval=0, PingTimeout=5000, RequestTimeout=5000, ConnectTimeout=5000, RemoteAddresses=[spmbs008/172.21.194.185:9099,spmbs006/172.21.194.186:9099], KeepAliveEnabled=true, TcpDelayEnabled=false, ReceiveBufferSize=0, SendBufferSize=0, LingerTimeout=-1}
    2009-04-23 13:57:02.675/7.641 Oracle Coherence GE 3.4.2/411 Error (thread=main, member=n/a): Error while starting service "ExtendTcpCacheService": com.tangosol.net.messaging.ConnectionException
    at com.tangosol.coherence.component.util.daemon.queueProcessor.service.peer.initiator.TcpInitiator$TcpConnection$TcpReader.onNotify(TcpInitiator.CDB:46)
    at com.tangosol.coherence.component.util.Daemon.run(Daemon.CDB:37)
    at java.lang.Thread.run(Unknown Source)
    Caused by: java.io.EOFException
    at java.io.DataInputStream.readUnsignedByte(Unknown Source)
    at com.tangosol.util.ExternalizableHelper.readInt(ExternalizableHelper.java:493)
    at com.tangosol.coherence.component.util.daemon.queueProcessor.service.peer.initiator.TcpInitiator$TcpConnection$TcpReader.onNotify(TcpInitiator.CDB:20)
    at com.tangosol.coherence.component.util.Daemon.run(Daemon.CDB:37)
    at java.lang.Thread.run(Unknown Source)
    ************

    and when I look at the extend node on the cluster, I see this

    ************
    04/23/09 13:56:38.899 INFO: [DiagnosticsPlugin] [M: 73M/494M/494M] [T: D(2) O(54)] [5.1.0.21] [JRE: 1.5.0_08/Sun Microsystems Inc.] [OS: Windows 2003/5.2/x86] [H: 172.21.194.185]
    2009-04-23 13:57:02.335/4529.301 Oracle Coherence GE 3.4.2/411 Error (thread=Proxy:ExtendTcpProxyService:TcpAcceptor, member=2): An exception occurred while decoding a Message for Service=Proxy:ExtendTcpProxyService:TcpAcceptor received from: TcpConnection(Id=null, Open=true, LocalAddress=172.21.194.185:9099, RemoteAddress=11.176.203.168:2585): java.lang.ClassCastException: com.tangosol.io.pof.PortableException
    at com.tangosol.coherence.component.util.daemon.queueProcessor.service.Peer$MessageFactory$OpenConnectionRequest.readExternal(Peer.CDB:6)
    at com.tangosol.coherence.component.net.extend.Codec.decode(Codec.CDB:29)
    at com.tangosol.coherence.component.util.daemon.queueProcessor.service.Peer.decodeMessage(Peer.CDB:25)
    at com.tangosol.coherence.component.util.daemon.queueProcessor.service.Peer.onNotify(Peer.CDB:47)
    at com.tangosol.coherence.component.util.Daemon.run(Daemon.CDB:37)
    at java.lang.Thread.run(Thread.java:595)
    ************

    I'm wondering what I'm doing wrong.

    On my cluster, I have this defined (snippet of cache-extend-config.xml)

    ************
    <serializer>
      <class-name>com.tangosol.io.pof.ConfigurablePofContext</class-name>
      <init-params>
        <init-param>
          <param-type>string</param-type>
          <param-value>custom-types-pof-config.xml</param-value>
        </init-param>
      </init-params>
    </serializer>
    ************

    and

    ************
    ?xml version="1.0"?
    !DOCTYPE pof-config SYSTEM "pof-config.dtd"
    pof-config
    user-type-list
    !-- include all "standard" Coherence POF user types --
    includeexample-pof-config.xml/include
    /user-type-list
    /pof-config
    ************

    On my client, I have the following

    -Dtangosol.coherence.cacheconfig=./config/cache-extend-config.xml -Dtangosol.pof.config=./config/pof-config.xml

    ************
    ?xml version="1.0"?
    !DOCTYPE pof-config SYSTEM "pof-config.dtd"
    pof-config
    user-type-list
    includeexample-pof-config.xml/include
    /user-type-list
    /pof-config
    ************

    ************
    ?xml version="1.0"?
    cache-config xmlns="http://schemas.tangosol.com/cache"
    caching-scheme-mapping
    cache-mapping
    cache-name*/cache-name
    scheme-namedistributed-cache/scheme-name
    /cache-mapping
    /caching-scheme-mapping

    caching-schemes
    remote-cache-scheme
    scheme-namedistributed-cache/scheme-name
    service-nameExtendTcpCacheService/service-name
    initiator-config
    tcp-initiator
    remote-addresses
    socket-address
    addressspmbs008/address
    port9099/port
    /socket-address
    socket-address
    addressspmbs006/address
    port9099/port
    /socket-address
    /remote-addresses
    /tcp-initiator

    outgoing-message-handler
    request-timeout5s/request-timeout
    /outgoing-message-handler
    /initiator-config
    serializer
    class-namecom.tangosol.io.pof.ConfigurablePofContext/class-name
    init-params
    init-param
    param-typeString/param-type
    param-valuepof-config.xml/param-value
    /init-param
    /init-params
    /serializer
    /remote-cache-scheme
    /caching-schemes
    /cache-config
    ************

    Not sure if i'm doing something wrong. I just want to be able to talk to the same cache via Java and .NET

  • RELEVANCY SCORE 3.01

    DB:3.01:An Exception Occurred Instantiating A Portableobject User Type From A Pof x8


    Hi,

    While working through the Coherence tutorial I have come across the following error:

    Exception in thread "Main Thread" (Wrapped) java.io.IOException: An exception occurred instantiating a PortableObject user type from a POF stream: type-id=1002, class-name=com.oracle.handson.Address, exception=
    java.lang.InstantiationException: com.oracle.handson.Address
    at com.tangosol.util.ExternalizableHelper.fromBinary(ExternalizableHelper.java:266)
    at com.tangosol.coherence.component.util.daemon.queueProcessor.service.grid.partitionedService.PartitionedCache$ConverterFromBinary.convert(PartitionedCache.CDB:4)
    at com.tangosol.util.ConverterCollections$ConverterMap.put(ConverterCollections.java:1578)
    at com.tangosol.coherence.component.util.daemon.queueProcessor.service.grid.partitionedService.PartitionedCache$ViewMap.put(PartitionedCache.CDB:1)
    at com.tangosol.coherence.component.util.SafeNamedCache.put(SafeNamedCache.CDB:1)
    at com.oracle.handson.ContactDriver.main(ContactDriver.java:26)
    Caused by: java.io.IOException: An exception occurred instantiating a PortableObject user type from a POF stream: type-id=1002, class-name=com.oracle.handson.Address, exception=
    java.lang.InstantiationException: com.oracle.handson.Address
    at com.tangosol.io.pof.PortableObjectSerializer.deserialize(PortableObjectSerializer.java:122)
    at com.tangosol.io.pof.PofBufferReader.readAsObject(PofBufferReader.java:3307)
    at com.tangosol.io.pof.PofBufferReader.readObject(PofBufferReader.java:2604)
    at com.tangosol.io.pof.ConfigurablePofContext.deserialize(ConfigurablePofContext.java:342)
    at com.tangosol.util.ExternalizableHelper.deserializeInternal(ExternalizableHelper.java:2708)
    at com.tangosol.util.ExternalizableHelper.fromBinary(ExternalizableHelper.java:262)
    ... 5 more
    Process exited with exit code 1.

    Please suggest a solution

    DB:3.01:An Exception Occurred Instantiating A Portableobject User Type From A Pof x8

    Hi,

    The error you're getting is probably caused by not having a default constructor in the Account class; check if that's true.

    Regards,

    Ivan
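
    A minimal sketch of what the fix looks like for the Address type from the stack trace; the fields are assumptions, the point is the accessible public no-argument constructor that POF needs in order to instantiate the object.

    import java.io.IOException;
    import com.tangosol.io.pof.PofReader;
    import com.tangosol.io.pof.PofWriter;
    import com.tangosol.io.pof.PortableObject;

    public class Address implements PortableObject {
        private String street;   // hypothetical fields for illustration
        private String city;

        public Address() {
            // required: without it, deserialization fails with
            // java.lang.InstantiationException as in the stack trace above
        }

        public Address(String street, String city) {
            this.street = street;
            this.city = city;
        }

        public void readExternal(PofReader in) throws IOException {
            street = in.readString(0);
            city = in.readString(1);
        }

        public void writeExternal(PofWriter out) throws IOException {
            out.writeString(0, street);
            out.writeString(1, city);
        }
    }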

  • RELEVANCY SCORE 2.98

    DB:2.98:Getting Schema Validation Working In Eclipse With Coherence 3.7.1.0 1j


    Just wondered if anyone had got schema validation to work in Eclipse (3.5 - Galileo) for Coherence 3.7.1.0?

    The Coherence developer docs show that you should add sections like this:

    <pof-config xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
                xmlns="http://xmlns.oracle.com/coherence/coherence-pof-config"
                xsi:schemaLocation="http://xmlns.oracle.com/coherence/coherence-pof-config
                                    coherence-pof-config.xsd">

    However, if I use that "shortened" form (for cache, pof, etc. configs) Eclipse gives a warning "No grammar constraints (DTD or XML schema) detected for the document." and the schema validation fails to work (i.e. no "auto pop-ups" when entering content, and rubbish content is gladly accepted.)

    In Coherence 3.7.0, I'd used the following "extended" form (note the longer "schemaLocation") to get things working correctly:

    <pof-config xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
                xmlns="http://xmlns.oracle.com/coherence/coherence-pof-config"
                xsi:schemaLocation="http://xmlns.oracle.com/coherence/coherence-pof-config
                                    http://xmlns.oracle.com/coherence/coherence-pof-config/1.0/coherence-pof-config.xsd">

    Also note that entering the "http://xmlns.oracle.com/coherence/coherence-pof-config/1.0/coherence-pof-config.xsd" in a web browser opens the xsd, as you'd expect.

    Now...

    I'm looking at the PofAnnotationSerializer in 3.7.1.0 and the "auto indexing" option. The declaration in my pof file for it fails as the "class" (or fully-qualified java.lang.Class version) in the "init-params" section isn't valid. If I look at the xsd on the url above, this is indeed the case, that option does not appear.

    However, if I look at the POF xsd in the coherence.jar file for 3.7.1.0, the "class" option has been added, and has a newer 'version="1.1"' added to it's schema declaration. So I therefore tried to point my "extended" declaration to point to " http://xmlns.oracle.com/coherence/coherence-pof-config/1.1/coherence-pof-config.xsd", in order to get schema validation to work in Eclipse with this new schema. Unfortunately that url doesn't exist - you get a "Content Server Request Failed" error.

    So, I guess my question is, is there a way to get myself pointed at the 1.1 versions of the xsd's so I can have schema validation working in Ecliipse? Or is there another workaround (did a bit of Googling, but that mainly seemed to be people switching validation off to simply get rid of the error...)

    Cheers,

    Steve

  • RELEVANCY SCORE 2.98

    DB:2.98:Re: Storing Large Datetime Values From .Net ss


    Hi Timur,

    Would it be possible for you to post:

    (1) a stack trace
    (2) .NET and Java code for their POF data types
    (3) configuration files (cache, POF) for java and .net

    Also some further clarifications could possibly be helpful:

    Are the working serialization in Java and the "broken" one in .NET via POF two different caches/tests?

    Where is the exception happening (and what is it)? Is it in POF-ing the .NET DateTime object, or in de-POF-ing it on the server side?

    Why are you not just using null for DateTime fields instead of a special date?

    cheers,

    -David Leibs

    DB:2.98:Re: Storing Large Datetime Values From .Net ss

    You may want to consider transporting your special value 1/1/9999 as null on the wire by simply testing and converting during serialization and then back to 1/1/9999 during deserialization. This will have no impact on application logic and also be very compact on the wire.

    Happy Null Year ;)

    Mark
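
    On the Java side of such a user type, the sentinel-to-null conversion Mark describes could look roughly like the sketch below. The property index, the use of writeObject/readObject and the exact epoch value of the 1/1/9999 sentinel (UTC assumed) are all assumptions for illustration.

    import java.io.IOException;
    import java.util.Date;
    import com.tangosol.io.pof.PofReader;
    import com.tangosol.io.pof.PofWriter;

    public class DateSentinelPofHelper {
        // Hypothetical sentinel matching the .NET "special date" 9999-01-01T00:00:00Z.
        private static final Date SENTINEL = new Date(253370764800000L);

        static void writeDate(PofWriter out, int iProp, Date value) throws IOException {
            // Send the sentinel as null on the wire.
            out.writeObject(iProp, SENTINEL.equals(value) ? null : value);
        }

        static Date readDate(PofReader in, int iProp) throws IOException {
            // Convert null back to the sentinel after deserialization.
            Date value = (Date) in.readObject(iProp);
            return value == null ? SENTINEL : value;
        }
    }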

  • RELEVANCY SCORE 2.97

    DB:2.97:Owa Rules Engines Throws An Exception. What To Do? k9


    OWA Rules Engine throws the following exception: "The formatter threw an exception while trying to deserialize the message: There was an error while trying to deserialize parameter ECP:sort. The InnerException message was 'An internal error has occurred. 'System.String' is not assignable from 'System.Int32' - error generating code for serialization.'. Please see InnerException for more details."

    What to do?

    Thanks
    Joerg

    DB:2.97:Owa Rules Engines Throws An Exception. What To Do? k9

    Did anybody try this? http://simplovation.com/forums/t/95eac72d-c5a5-4607-a7db-165b4733410e.aspx

    Srdjan

  • RELEVANCY SCORE 2.96

    DB:2.96:Using Set Directly As Value Of Pof Cache xc


    All,

    I'm a long-time Coherence user but just started using POF. I'm on 3.6. I'm getting this exception:

    Unhandled Exception: java.lang.ClassCastException: com.tangosol.util.ImmutableArrayList$ListView cannot be cast to java.util.Set

    The Set is the value in my cache; it is not an attribute of the value. I've seen other posts about how to handle that in my own serialization code, but not when I'm getting it directly like I am now. The implementation type is just an ordinary HashSet and the contents are uniform, if that matters.

    As a workaround I think I can do new HashSet().addAll(cache.get(whatever)), but it seems this should not be necessary.

    thanks

    DB:2.96:Using Set Directly As Value Of Pof Cache xc

    John -

    Normally, when deserializing, if you need it to be a HashSet, you'd pass a "new HashSet()" to PofReader.readCollection().

    Peace,
    Cameron | Oracle
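
    A minimal sketch of that suggestion, assuming the Set is owned by a wrapper class and stored at property index 0 (both the wrapper and the index are hypothetical), rather than being the cache value directly:

    import java.io.IOException;
    import java.util.HashSet;
    import java.util.Set;
    import com.tangosol.io.pof.PofReader;
    import com.tangosol.io.pof.PofWriter;
    import com.tangosol.io.pof.PortableObject;

    public class StringSetHolder implements PortableObject {
        private Set<String> values = new HashSet<String>();

        public Set<String> getValues() { return values; }

        @SuppressWarnings("unchecked")
        public void readExternal(PofReader in) throws IOException {
            // Passing a new HashSet() makes the deserialized value a real java.util.Set.
            values = (Set<String>) in.readCollection(0, new HashSet<String>());
        }

        public void writeExternal(PofWriter out) throws IOException {
            out.writeCollection(0, values);
        }
    }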

  • RELEVANCY SCORE 2.96

    DB:2.96:Serialization With Wait z7


    Hi,

    I am using dbms_sleep() while implementing serialization.
    This returns error message for a session, while another session is in process
    declare
    *
    ERROR at line 1:
    ORA-20001: Timed Out - As another session is Running the application
    ORA-06512: at line 7

    I don't want the error message to be displayed while the application is waiting to acquire the lock.

    Could you please help me?

    Thank You,
    -Aparna

    DB:2.96:Serialization With Wait z7

    Note the name of this forum is "SQL Developer *(Not for general SQL/PLSQL questions)*" (so only for issues with the SQL Developer tool). Please post these questions under the dedicated SQL And PL/SQL forum (you've posted there before).

    Regards,
    K.

  • RELEVANCY SCORE 2.91

    DB:2.91:Need Help Configuring For Pof j1


    I am trying to use POF to serialize one specific named cache only. The client nodes are configured for near caches with no local storage. I ran into a problem where I got error log complaints that another node in the cluster was not configured for POF serialization for the DistributedCache service. So, I created a new service PofDistributedCache service for use by the pof cache. That changed my errors but didn't get me very far.

    Q1: If I have mixed pof / non-pof caches, do they need to use different DistributedCache services?
    Q2: Does the server (back-cache) also need a serializer block?
    Q3: Does the server need all the object classes and the classes needed to (de)serialize the objects?

    --Larkin

    Client side coherence-cache-config.xml:

    <?xml version="1.0" encoding="UTF-8"?>
    <!DOCTYPE cache-config SYSTEM "cache-config.dtd">

    <cache-config>

    <caching-scheme-mapping>
    <cache-mapping>
    <cache-name>pof-*</cache-name>
    <scheme-name>default-near-pof</scheme-name>
    <init-params>
    <init-param-name>front-size-limit</init-param-name>
    <init-param-value system-property="foo.coherence.default.front-size-limit">0</init-param-value>
    </init-params>
    </cache-mapping>

    <cache-mapping>
    <cache-name>*</cache-name>
    <scheme-name>default-near</scheme-name>
    <init-params>
    <init-param-name>front-size-limit</init-param-name>
    <init-param-value system-property="foo.coherence.default.front-size-limit">0</init-param-value>
    </init-params>
    </cache-mapping>
    </caching-scheme-mapping>

    <caching-schemes>
    <near-scheme>
    <scheme-name>default-near</scheme-name>
    <front-scheme>
    <local-scheme>
    <scheme-ref>default-local</scheme-ref>
    </local-scheme>
    </front-scheme>
    <back-scheme>
    <distributed-scheme>
    <scheme-ref>default-distributed</scheme-ref>
    </distributed-scheme>
    </back-scheme>
    </near-scheme>

    <near-scheme>
    <scheme-name>default-near-pof</scheme-name>
    <front-scheme>
    <local-scheme>
    <scheme-ref>default-local</scheme-ref>
    </local-scheme>
    </front-scheme>
    <back-scheme>
    <distributed-scheme>
    <scheme-ref>default-distributed-pof</scheme-ref>
    </distributed-scheme>
    </back-scheme>
    </near-scheme>

    <local-scheme>
    <scheme-name>default-local</scheme-name>
    <high-units>{front-size-limit 0}</high-units>
    </local-scheme>

    <!--
    This config file is for client use only. The back-cache will not
    provide any local storage to the cluster.
    -->
    <distributed-scheme>
    <scheme-name>default-distributed</scheme-name>
    <service-name>DistributedCache</service-name>
    <local-storage>${coherence.back-cache.storage}</local-storage>
    <backing-map-scheme>
    <local-scheme>
    <scheme-ref>default-local</scheme-ref>
    </local-scheme>
    </backing-map-scheme>
    </distributed-scheme>

    <distributed-scheme>
    <scheme-name>default-distributed-pof</scheme-name>
    <service-name>PofDistributedCache</service-name>
    <local-storage>${coherence.back-cache.storage}</local-storage>
    <backing-map-scheme>
    <local-scheme>
    <scheme-ref>default-local</scheme-ref>
    </local-scheme>
    </backing-map-scheme>
    <serializer>
    <class-name>com.tangosol.io.pof.ConfigurablePofContext</class-name>
    </serializer>
    </distributed-scheme>
    </caching-schemes>

    </cache-config>

    Server side coherence-cache-config.xml

    <?xml version="1.0" encoding="UTF-8"?>
    <!DOCTYPE cache-config SYSTEM "cache-config.dtd">

    <cache-config>

    <caching-scheme-mapping>
    <cache-mapping>
    <cache-name>pof-*</cache-name>
    <scheme-name>default-distributed-pof</scheme-name>
    </cache-mapping>
    <cache-mapping>
    <cache-name>*</cache-name>
    <scheme-name>default-distributed</scheme-name>
    </cache-mapping>
    </caching-scheme-mapping>

    <caching-schemes>

    <distributed-scheme>
    <scheme-name>default-distributed</scheme-name>
    <service-name>DistributedCache</service-name>
    <backing-map-scheme>
    <local-scheme>
    <scheme-ref>default-local</scheme-ref>
    </local-scheme>
    </backing-map-scheme>
    <autostart>true</autostart>
    </distributed-scheme>
    <distributed-scheme>
    <scheme-name>default-distributed-pof</scheme-name>
    <service-name>PofDistributedCache</service-name>
    <backing-map-scheme>
    <local-scheme>
    <scheme-ref>default-local</scheme-ref>
    </local-scheme>
    </backing-map-scheme>
    <autostart>true</autostart>
    </distributed-scheme>

    <local-scheme>
    <unit-calculator>BINARY</unit-calculator>
    <scheme-name>default-local</scheme-name>
    </local-scheme>

    </caching-schemes>

    </cache-config>

    DB:2.91:Need Help Configuring For Pof j1

    Hi Larkin,

    llowrey wrote:
    I am trying to use POF to serialize one specific named cache only. The client nodes are configured for near caches with no local storage. I ran into a problem where I got error log complaints that another node in the cluster was not configured for POF serialization for the DistributedCache service. So, I created a new service PofDistributedCache service for use by the pof cache. That changed my errors but didn't get me very far.

    Q1: If I have mixed pof / non-pof caches, do they need to use different DistributedCache services?

    Yes. You can control POF/old-style serialization on a service-by-service basis only.

    Q2: Does the server (back-cache) also need a serializer block?

    It is not relevant on the near cache. The scheme defining the back cache (and invocation services and replicated cache schemes) needs to have the serializer specified.

    Q3: Does the server need all the object classes and the classes needed to (de)serialize the objects?

    If you want to deserialize the objects, then certainly it does. But with POF you don't necessarily need to deserialize entries from partitioned caches to define indexes or run entry-processors/aggregations on them. You can leverage PofExtractor-s and PofNavigator-s to do all your server-side logic, although for complex data access it may be less efficient. You do need the key classes (on the NamedCache caller side) for being able to do operations on a partitioned cache, though.

    Best regards,

    Robert
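
    As a small illustration of the PofExtractor approach mentioned above (a sketch; the cache name and POF index are illustrative, not taken from the original post):

    import com.tangosol.net.CacheFactory;
    import com.tangosol.net.NamedCache;
    import com.tangosol.util.ValueExtractor;
    import com.tangosol.util.extractor.PofExtractor;
    import com.tangosol.util.filter.EqualsFilter;

    public class PofQueryExample {
        public static void main(String[] args) {
            NamedCache cache = CacheFactory.getCache("pof-trades");

            // extract the POF property at index 1 (assumed here to hold a symbol)
            ValueExtractor symbol = new PofExtractor(String.class, 1);

            // both the index and the filter are evaluated against the binary form
            // on the storage members, so the value class is not needed there
            cache.addIndex(symbol, false, null);
            cache.entrySet(new EqualsFilter(symbol, "ORCL"));
        }
    }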

  • RELEVANCY SCORE 2.89

    DB:2.89:Binary Serialization xz


     
    I have done binary serialization of a class and stored the result in the database. I have since moved this class into another class library (the namespace remains the same), and when I read the stored values back from the database and try to deserialize them, I get the following error:

    Unable to load type XXXXXXXXXXXXXX required for deserialization.

    Does the binary serialized data hold the assembly information too? If not, then my scenario above should work.

    Any help or suggestion would be appreciated.

    DB:2.89:Binary Serialization xz

    Your understanding is correct.

    Any change in the type information - such as the namespace, the exposed variables, or the assembly to which it is bound - will cause deserialization of the persisted object to fail.

  • RELEVANCY SCORE 2.87

    DB:2.87:Both Type And Assembly Name Must Be Specified j1


    Hi All,

    I have a 3rd-party DLL and I am adding a reference to it in my project.
    I have created a wrapper class for one of the 3rd-party DLL classes; my wrapper class implements IPofSerializer and serializes the object.

    In my pof-config.xml file I have created a new user-type for the class in the 3rd party dll.
    -------------------
    <pof-config xmlns="http://schemas.tangosol.com/pof">
    <user-type-list>
    <!-- include all "standard" Coherence POF user types -->
    <include>assembly://Coherence/Tangosol.Config/coherence-pof-config.xml</include>

    <!-- include all application POF user types -->
    <user-type>
    <type-id>1002</type-id>
    <class-name>DCBOMLib.DCStockMarketClass</class-name>
    </user-type>
    </user-type-list>
    </pof-config>
    ------------

    While starting the cache I am getting the error "Both type and assembly name must be specified". Where do I find the assembly info for the type DCBOMLib.DCStockMarketClass (it's a 3rd-party DLL)?

    Regards,
    Akhil

    DB:2.87:Both Type And Assembly Name Must Be Specified j1

    You cannot serialize arbitrary user types via POF - they at least need to be IPortableObject.

    When you made the call to:

    outo.WriteObject(5, cinfo.DCSCM);

    the POF serializer will determine its type and will seek out a serializer for it. In this case, it is a user type, but you do not have an IPortableObject "wrapper" for DCStockMarketClass.

    You really have several ways to do what you want, depending on your requirements:

    1. If DCStockMarketClass is language-specific and internal - you may not want to query/manipulate it from Coherence or other languages anyway. You may just want it to be a BLOB of data embedded in a POF type. In this case, you could simply serialize DCStockMarketClass using any available mechanism that is supported by POF, i.e. a byte stream, an XML string, etc. You can then use WriteByteArray or WriteString. In this case, you don't need to register a new type.

    2. If you DO want to query/manipulate StockMarketClass properties from any other language via POF, you need to reflect its properties and serialize them individually. This can either be inline in ContactInfoSer, or a brand-new type DCStockMarketClassWrapper (which derives from IPortableObject). If you do it inline, you don't need to register any new types. If you create a wrapper, you need to register the wrapper class.

  • RELEVANCY SCORE 2.87

    DB:2.87:Re: Error While Starting Coherence Server m7


    Hi,

    Your cache config is not the problem, your POF config is.

    This...

    Exception in thread "main" (Wrapped) (Wrapped: error configuring class "com.tangosol.io.pof.ConfigurablePofContext") java.lang.IllegalStateException: Duplicate included POF configuration (Config=Manager-pof-config.xml)

    ...tells you what is wrong. It looks like you are trying to include the same POF config file twice.

    Posting your cache configuration file is not going to explain the problem; you need to post your POF config file - presumably the one called C:/Users/lakshmana/JPACoherenceWorkspace/Application/appClientModule/Manager-pof-config.xml

    In your cache configuration you have this...

    <init-param>
    <param-type>String</param-type>
    <param-value system-property="Manager-pof-config.xml">/C:/Users/lakshmana/JPACoherenceWorkspace/Application/appClientModule/Manager-pof-config.xml</param-value>
    </init-param>

    I don't think this is doing what you think it is. The system-property="Manager-pof-config.xml" part will look for a system property called Manager-pof-config.xml and, if that property is set, will use the value of that property to override your config file name.

    Hard coding the full path of a file into your cache config is not a good idea as it is very prone to breaking if the code moves to another location.

    JK

    DB:2.87:Re: Error While Starting Coherence Server m7

    Hi Jon, Manager.java implements the portable object, and Manager2.java serializes and deserializes the object.

    Jon, my intention is to implement the portable object so that any platform can use my p_object. Is that the right way to do it, or are any modifications required?

    If the Manager.java file is enough to implement the portable object, is there any special format after deploying, or is it the same as a class file?

    Thanks

  • RELEVANCY SCORE 2.86

    DB:2.86:Overriding Pof Serialization In Subclasses? pk


    When subclassing a class that supports POF serialization, how would I know what property index I can use for the additional elements added by the subclass, assuming that I do not have the source code of the superclass (and the information is not in the superclass javadoc)?

    Is it for instance possible to read the last property index used from the stream (PofWriter) somehow?

    I tried to find anything in the PofContext and PofWriter interface but did not see anything that looked relevant...

    An example of a situation where this would apply is creating a custom filter using a Coherence built-in filter as the base class.

    Or am I missing something obvious about how to handle subclassing and POF-serialization that makes this a non-issue?

    Best Regards
    Magnus

    DB:2.86:Overriding Pof Serialization In Subclasses? pk

    Good point!

    Using a very high number as proposed later in this thread is indeed a viable work-around until this is properly documented.

    Best Regards
    Magnus
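
    A sketch of that high-index work-around with illustrative class names (each concrete class still needs its own type-id in the POF configuration):

    import java.io.IOException;
    import com.tangosol.io.pof.PofReader;
    import com.tangosol.io.pof.PofWriter;
    import com.tangosol.io.pof.PortableObject;

    public class BaseEvent implements PortableObject {
        protected long timestamp;

        public void readExternal(PofReader in) throws IOException {
            timestamp = in.readLong(0);
        }

        public void writeExternal(PofWriter out) throws IOException {
            out.writeLong(0, timestamp);
        }
    }

    class AuditEvent extends BaseEvent {
        // start well above any index the superclass might use now or later;
        // POF only requires indexes within one type to be unique and ascending
        private static final int PROP_USER = 1000;

        private String user;

        @Override
        public void readExternal(PofReader in) throws IOException {
            super.readExternal(in);
            user = in.readString(PROP_USER);
        }

        @Override
        public void writeExternal(PofWriter out) throws IOException {
            super.writeExternal(out);
            out.writeString(PROP_USER, user);
        }
    }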

  • RELEVANCY SCORE 2.85

    DB:2.85:What Is Pof In Oracle Coherence? z1


    What is POF in Oracle Coherence?

    Thank you

    DB:2.85:What Is Pof In Oracle Coherence? z1

    The POF format and a lot of information on how to use POF with Coherence is detailed in the upcoming Coherence 3.5 documentation.

    Peace,

    Cameron Purdy | Oracle

  • RELEVANCY SCORE 2.85

    DB:2.85:Streamcorruptedexception Error xm


    I am getting the following weird exception, java.io.StreamCorruptedException - any ideas what's wrong? Things were working fine before.

    2011-10-20 12:04:40.741/1.418 Oracle Coherence GE 3.6.0.4 D5 (thread=Invocation:Management, member=10): Service Management joined the cluster with senior service member 1
    2011-10-20 12:04:40.974/1.651 Oracle Coherence GE 3.6.0.4 Info (thread=DistributedCache, member=10): Loaded POF configuration from "jar:file:/cgbu/home4/anasthan/Oracle/Middleware/coherence_3.6/lib/coherence.jar!/tokens-pof-config.xml"
    2011-10-20 12:04:40.986/1.663 Oracle Coherence GE 3.6.0.4 Info (thread=DistributedCache, member=10): Loaded included POF configuration from "jar:file:/cgbu/home4/anasthan/Oracle/Middleware/coherence_3.6/lib/coherence.jar!/coherence-pof-config.xml"
    2011-10-20 12:04:41.059/1.736 Oracle Coherence GE 3.6.0.4 D5 (thread=DistributedCache, member=10): Service DistributedCache joined the cluster with senior service member 2
    2011-10-20 12:04:41.073/1.750 Oracle Coherence GE 3.6.0.4 Error (thread=DistributedCache, member=10): The service "DistributedCache" is configured to use serializer com.tangosol.io.pof.ConfigurablePofContext {location=tokens-pof-config.xml}, which appears to be different from the serializer used by Member(Id=2, Timestamp=2011-10-20 02:09:38.765, Address=10.241.32.51:8090, MachineId=56115, Location=site:us.oracle.com,machine:cgbuperfmlin3,process:4936, Role=WeblogicServer).
    java.io.StreamCorruptedException: unknown user type: 6
    at com.tangosol.io.pof.PofBufferReader.readAsObject(PofBufferReader.java:3302)
    at com.tangosol.io.pof.PofBufferReader.readObject(PofBufferReader.java:2603)
    at com.tangosol.io.pof.ConfigurablePofContext.deserialize(ConfigurablePofContext.java:358)
    at com.tangosol.coherence.component.util.daemon.queueProcessor.Service.readObject(Service.CDB:1)
    at com.tangosol.coherence.component.util.daemon.queueProcessor.service.Grid$MemberWelcome.read(Grid.CDB:17)
    at com.tangosol.coherence.component.util.daemon.queueProcessor.service.grid.PartitionedService$MemberWelcome.read(PartitionedService.CDB:6)
    at com.tangosol.coherence.component.util.daemon.queueProcessor.service.Grid.deserializeMessage(Grid.CDB:42)
    at com.tangosol.coherence.component.util.daemon.queueProcessor.service.Grid.onNotify(Grid.CDB:31)
    at com.tangosol.coherence.component.util.daemon.queueProcessor.service.grid.PartitionedService.onNotify(PartitionedService.CDB:3)
    at java.lang.Thread.run(Thread.java:619)
    Stopping the DistributedCache service.

    The following is my distributed caching scheme; I have POF configured.
    =================================================
    <distributed-scheme>
    <scheme-name>example-distributed</scheme-name>
    <service-name>DistributedCache</service-name>
    <serializer>
    <class-name>com.tangosol.io.pof.ConfigurablePofContext</class-name>
    <init-params>
    <init-param>
    <param-type>string</param-type>
    <param-value>tokens-pof-config.xml</param-value>
    </init-param>
    </init-params>
    </serializer>
    <backing-map-scheme>
    <local-scheme>
    <scheme-ref>example-binary-backing-map</scheme-ref>
    </local-scheme>
    </backing-map-scheme>
    <autostart>true</autostart>
    </distributed-scheme>

    tokens-pof-config.xml
    =============
    <?xml version="1.0"?>
    <pof-config xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xmlns="http://xmlns.oracle.com/coherence/coherence-pof-config"
    xsi:schemaLocation="http://xmlns.oracle.com/coherence/coherence-pof-config coherence-pof-config.xsd">
    <user-type-list>
    <!-- coherence POF user types -->
    <include>coherence-pof-config.xml</include>
    <!-- com.tangosol.examples package -->
    <user-type>
    <type-id>1001</type-id>
    <class-name>Token</class-name>
    </user-type>
    <user-type>
    <type-id>1002</type-id>
    <class-name>Asdl</class-name>
    </user-type>
    </user-type-list>
    <allow-interfaces>true</allow-interfaces>
    <allow-subclasses>true</allow-subclasses>
    </pof-config>


    DB:2.85:Streamcorruptedexception Error xm

    I solved it , thanks for your help JK, you rule :).

    Ankit

  • RELEVANCY SCORE 2.84

    DB:2.84:Missing Pofserializer Configuration ad


    Any ideas what I may have missed when converting to pof?
    2011-07-18 18:23:23.006/12.178 Oracle Coherence GE 3.7.0.2 Error (thread=main, member=1): Error while starting service "DistributedQuotesCacheService": (Wrapped) (Wrapped: error creating class "com.tangosol.io.pof.ConfigurablePofContext") java.lang.IllegalStateException: Missing PofSerializer configuration (Config=z:\coherence\pof-config.xml, Type-Id=10000, Class-Name=dj_quotes.DJQuote)
    at com.tangosol.coherence.component.util.Daemon.start(Daemon.CDB:52)
    at com.tangosol.coherence.component.util.daemon.queueProcessor.Service.start(Service.CDB:7)
    at com.tangosol.coherence.component.util.daemon.queueProcessor.service.Grid.start(Grid.CDB:6)
    at com.tangosol.coherence.component.util.SafeService.startService(SafeService.CDB:39)
    at com.tangosol.coherence.component.util.safeService.SafeCacheService.startService(SafeCacheService.CDB:5)
    at com.tangosol.coherence.component.util.SafeService.ensureRunningService(SafeService.CDB:27)
    at com.tangosol.coherence.component.util.SafeService.start(SafeService.CDB:14)
    at com.tangosol.net.DefaultConfigurableCacheFactory.ensureServiceInternal(DefaultConfigurableCacheFactory.java:1102)
    at com.tangosol.net.DefaultConfigurableCacheFactory.ensureService(DefaultConfigurableCacheFactory.java:934)
    at com.tangosol.net.DefaultCacheServer.startServices(DefaultCacheServer.java:81)
    at com.tangosol.net.DefaultCacheServer.intialStartServices(DefaultCacheServer.java:250)
    at com.tangosol.net.DefaultCacheServer.startAndMonitor(DefaultCacheServer.java:55)
    at com.tangosol.net.DefaultCacheServer.main(DefaultCacheServer.java:197)
    Caused by: (Wrapped: error creating class "com.tangosol.io.pof.ConfigurablePofContext") java.lang.IllegalStateException: Missing PofSerializer configuration (Config=z:\coherence\pof-config.xml, Type-Id=10000, Class-Name=dj_quotes.DJQuote)
    at com.tangosol.io.ConfigurableSerializerFactory.createSerializer(ConfigurableSerializerFactory.java:46)
    at com.tangosol.coherence.component.util.daemon.queueProcessor.Service.instantiateSerializer(Service.CDB:1)
    at com.tangosol.coherence.component.util.daemon.queueProcessor.Service.ensureSerializer(Service.CDB:32)
    at com.tangosol.coherence.component.util.daemon.queueProcessor.Service.ensureSerializer(Service.CDB:4)
    at com.tangosol.coherence.component.util.daemon.queueProcessor.service.Grid.onEnter(Grid.CDB:26)
    at com.tangosol.coherence.component.util.daemon.queueProcessor.service.grid.PartitionedService.onEnter(PartitionedService.CDB:19)
    at com.tangosol.coherence.component.util.Daemon.run(Daemon.CDB:14)
    at java.lang.Thread.run(Thread.java:662)
    Caused by: java.lang.IllegalStateException: Missing PofSerializer configuration (Config=z:\coherence\pof-config.xml, Type-Id=10000, Class-Name=dj_quotes.DJQuote)
    at com.tangosol.io.pof.ConfigurablePofContext.report(ConfigurablePofContext.java:1254)
    at com.tangosol.io.pof.ConfigurablePofContext.createPofConfig(ConfigurablePofContext.java:989)
    at com.tangosol.io.pof.ConfigurablePofContext.initialize(ConfigurablePofContext.java:775)
    at com.tangosol.io.pof.ConfigurablePofContext.setContextClassLoader(ConfigurablePofContext.java:319)
    at com.tangosol.io.ConfigurableSerializerFactory.createSerializer(ConfigurableSerializerFactory.java:42)
    ... 7 more

    Thanks,
    Andrew

    DB:2.84:Missing Pofserializer Configuration ad

    Looks like drowland was on the right track. I think a build issue was causing an old non-POF class to get deployed in the new POF jar. Thanks...

    Andrew

  • RELEVANCY SCORE 2.84

    DB:2.84:Why We Need Java Class In C++ Pof Serialization ap


    Hi,
    I'm really confused about why we need a Java class which implements PortableObject to support complex objects from C++. If we are not using any queries or entry processors in the application, can't we keep the object in its serialized byte format and retrieve it via the C++ deserialization?

    Please share your thoughts on whether there is a way to skip any Java implementation.

    regards,
    Sura

    DB:2.84:Why We Need Java Class In C++ Pof Serialization ap


  • RELEVANCY SCORE 2.84

    DB:2.84:Error While Pof Serialization 78


    Hi, I am getting the following error while doing POF serialization. Please help.

    java.lang.StackOverflowError
    at com.tangosol.io.pof.PofBufferWriter$UserTypeWriter.isEvolvable(PofBufferWriter.java:2753)
    at com.tangosol.io.pof.PofBufferWriter$UserTypeWriter.isEvolvable(PofBufferWriter.java:2753)
    at com.tangosol.io.pof.PofBufferWriter$UserTypeWriter.isEvolvable(PofBufferWriter.java:2753)
    at com.tangosol.io.pof.PofBufferWriter$UserTypeWriter.isEvolvable(PofBufferWriter.java:2753)
    at com.tangosol.io.pof.PofBufferWriter$UserTypeWriter.isEvolvable(PofBufferWriter.java:2753)
    at com.tangosol.io.pof.PofBufferWriter$UserTypeWriter.isEvolvable(PofBufferWriter.java:2753)
    at com.tangosol.io.pof.PofBufferWriter$UserTypeWriter.isEvolvable(PofBufferWriter.java:2753)
    at com.tangosol.io.pof.PofBufferWriter$UserTypeWriter.isEvolvable(PofBufferWriter.java:2753)
    at com.tangosol.io.pof.PofBufferWriter$UserTypeWriter.isEvolvable(PofBufferWriter.java:2753)
    at com.tangosol.io.pof.PofBufferWriter$UserTypeWriter.isEvolvable(PofBufferWriter.java:2753)
    at com.tangosol.io.pof.PofBufferWriter$UserTypeWriter.isEvolvable(PofBufferWriter.java:2753)
    at com.tangosol.io.pof.PofBufferWriter$UserTypeWriter.isEvolvable(PofBufferWriter.java:2753)
    at com.tangosol.io.pof.PofBufferWriter$UserTypeWriter.isEvolvable(PofBufferWriter.java:2753)
    at com.tangosol.io.pof.PofBufferWriter$UserTypeWriter.isEvolvable(PofBufferWriter.java:2753)
    at com.tangosol.io.pof.PofBufferWriter$UserTypeWriter.isEvolvable(PofBufferWriter.java:2753)
    at com.tangosol.io.pof.PofBufferWriter$UserTypeWriter.isEvolvable(PofBufferWriter.java:2753)
    at com.tangosol.io.pof.PofBufferWriter$UserTypeWriter.isEvolvable(PofBufferWriter.java:2753)
    at com.tangosol.io.pof.PofBufferWriter$UserTypeWriter.isEvolvable(PofBufferWriter.java:2753)
    at com.tangosol.io.pof.PofBufferWriter$UserTypeWriter.isEvolvable(PofBufferWriter.java:2753)
    at com.tangosol.io.pof.PofBufferWriter$UserTypeWriter.isEvolvable(PofBufferWriter.java:2753)
    at com.tangosol.io.pof.PofBufferWriter$UserTypeWriter.isEvolvable(PofBufferWriter.java:2753)
    at com.tangosol.io.pof.PofBufferWriter$UserTypeWriter.isEvolvable(PofBufferWriter.java:2753)
    at com.tangosol.io.pof.PofBufferWriter$UserTypeWriter.isEvolvable(PofBufferWriter.java:2753)
    at com.tangosol.io.pof.PofBufferWriter$UserTypeWriter.isEvolvable(PofBufferWriter.java:2753)
    at com.tangosol.io.pof.PofBufferWriter$UserTypeWriter.isEvolvable(PofBufferWriter.java:2753)
    at com.tangosol.io.pof.PofBufferWriter$UserTypeWriter.isEvolvable(PofBufferWriter.java:2753)
    at com.tangosol.io.pof.PofBufferWriter$UserTypeWriter.isEvolvable(PofBufferWriter.java:2753)
    at com.tangosol.io.pof.PofBufferWriter$UserTypeWriter.isEvolvable(PofBufferWriter.java:2753)
    at com.tangosol.io.pof.PofBufferWriter$UserTypeWriter.isEvolvable(PofBufferWriter.java:2753)
    at com.tangosol.io.pof.PofBufferWriter$UserTypeWriter.isEvolvable(PofBufferWriter.java:2753)
    at com.tangosol.io.pof.PofBufferWriter$UserTypeWriter.isEvolvable(PofBufferWriter.java:2753)
    at com.tangosol.io.pof.PofBufferWriter$UserTypeWriter.isEvolvable(PofBufferWriter.java:2753)
    at com.tangosol.io.pof.PofBufferWriter$UserTypeWriter.isEvolvable(PofBufferWriter.java:2753)
    at com.tangosol.io.pof.PofBufferWriter$UserTypeWriter.isEvolvable(PofBufferWriter.java:2753)
    at com.tangosol.io.pof.PofBufferWriter$UserTypeWriter.isEvolvable(PofBufferWriter.java:2753)
    at com.tangosol.io.pof.PofBufferWriter$UserTypeWriter.isEvolvable(PofBufferWriter.java:2753)
    at com.tangosol.io.pof.PofBufferWriter$UserTypeWriter.isEvolvable(PofBufferWriter.java:2753)
    at com.tangosol.io.pof.PofBufferWriter$UserTypeWriter.isEvolvable(PofBufferWriter.java:2753)
    at com.tangosol.io.pof.PofBufferWriter$UserTypeWriter.isEvolvable(PofBufferWriter.java:2753)
    at com.tangosol.io.pof.PofBufferWriter$UserTypeWriter.isEvolvable(PofBufferWriter.java:2753)
    at com.tangosol.io.pof.PofBufferWriter$UserTypeWriter.isEvolvable(PofBufferWriter.java:2753)
    at com.tangosol.io.pof.PofBufferWriter$UserTypeWriter.isEvolvable(PofBufferWriter.java:2753)
    at com.tangosol.io.pof.PofBufferWriter$UserTypeWriter.isEvolvable(PofBufferWriter.java:2753)

    DB:2.84:Error While Pof Serialization 78

    Hi,

    I am getting a similar exception when using Coherence 3.7.1.

    Unfortunately I cannot open the screencast from my network.

    Thanks in advance.

    Durga Prasad

  • RELEVANCY SCORE 2.83

    DB:2.83:Any Advantages Of Serialization pp


    Hi ,

    However, we can also do object state persistence by using ObjectStreams, just as with serialization.

    But serialization writes the serialVersionUID to the file, and this can be used to check the correctness of the data while reading the file.

    Apart from this point, do we have any more advantages of serialization?

    Cheers.

    DB:2.83:Any Advantages Of Serialization pp

    Volcano wrote:
    However , we will do Object State persistance by using ObjectStreams also, as like Serialization.

    Lots of disadvantages of course. I certainly don't want my credit card number handled via standard java serialization.

  • RELEVANCY SCORE 2.79

    DB:2.79:Pof Serialization Of A Collection xp


    I am trying to POF-serialize a std::vector of objects and send it via Extend to a cache, where I should be able to retrieve my vector in the form of a Java collection. What is the best way to do that? Is Coherence C++ STL friendly?

    Edited by: dadashy on Oct 15, 2010 1:39 AM

    DB:2.79:Pof Serialization Of A Collection xp

    Hi Dadashy,

    Coherence C++ does not include direct support for POF serializing a std::vector<T>, but it is possible to achieve what you want without too much effort. The first thing you'll need to do is ensure that there are POF serializers for the vector's T element type, before you worry about serializing a "collection" of Ts; see http://download.oracle.com/docs/cd/E15357_01/coh.360/e15726/cpp_integrationobj.htm#BAJJEDIE for details on how to do this first step.

    Once you've made your Ts POF serializable, you may realize that you could choose to apply the same technique to make std::vector<T> POF serializable. While this approach would work, it would not achieve your desired effect of materializing as a Java collection on the other side. To do that you have a few options. By far the simplest option is to iterate over your std::vector and copy the elements into one of the various collection implementations which ship with Coherence C++; the coherence::util::ArrayList aka CircularArrayList would be a good candidate. A more complex but potentially more performant alternative would be to write a std::vector wrapper which implements the coherence::util::Collection or coherence::util::List interface. The latter approach is certainly more complex, but if your T elements extend coherence::lang::Object or use coherence::lang::Managed, then it offers some performance benefits over simply copying the content into an ArrayList.

    With either approach you will end up with an object which implements the coherence::util::Collection interface, and this can be written to the POF stream via PofWriter::writeCollection, and will materialize on the Java side as a java.util.Collection. Obviously the process needs to be reversed during deserialization, but both of the above techniques will work fine for deserialization as well.

    Mark
    Oracle Coherence

  • RELEVANCY SCORE 2.78

    DB:2.78:Pof Serialization Of Biginteger 8a


    The doc for PofWriter states that WriteBigInteger throws an IllegalStateException if the BigInteger is more than 128 bits, and indeed it does in practice. I have some objects containing BigIntegers which are larger than this. I can get round the problem for my object (serialize and deserialize the BigInteger as byte[]), but in this case the key is also a BigInteger - which may be more than 128 bits.

    I tried to add a serializer for BigInteger to the pof config, but it doesn't seem to override the default behaviour - is this possible?

    Obviously I can write some class to wrap this BigInteger key (though that's a bit of a drag), but I really think that Coherence should not refuse to serialize a perfectly valid Java object?

    DB:2.78:Pof Serialization Of Biginteger 8a

    Alasdair wrote:
    The doc for PofWriter states that WriteBigInteger throws an IllegalStateException if the BigInteger is more than 128 bits, and indeed it does in practice. I have some objects containing BigIntegers which are larger than this. I can get round the problem for my object (serialize and deserialize the BigInteger as byte[]), but in this case the key is also a BigInteger - which may be more than 128 bits.

    I tried to add a serializer for BigInteger to the pof config, but it doesn't seem to override the default behaviour - is this possible?

    Obviously I can write some class to wrap this BigInteger key (though that's a bit of a drag), but I really think that Coherence should not refuse to serialize a perfectly valid Java object?

    Hi Alasdair,

    BigInteger and BigDecimal are a bit unfairly treated if you look at it from the Java side. On the other hand, the POF format is not Java oriented; it is a platform-independent specification implemented among others in Java, which dictates not Java BigInteger/BigDecimal objects but rather 128-bit integer/floating point values, which happen to be represented in Java as BigInteger/BigDecimal objects. I agree that the method names writeBigDecimal and writeBigInteger are misleading.

    On the other hand, it should definitely be possible to register a custom PofSerializer for BigInteger and BigDecimal and that is captured as feature request COH-5308. I don't know when it will be released and what versions will be patched with it. Coding-wise it is a fairly simple change, on the other hand it may have some performance implications for serialization (not for deserialization), so it may be delayed a bit.

    Look for COH-5308 in patch release notes and/or try to ask your Oracle contact to find out when it is scheduled for releasing.

    Best regards,

    Robert
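
    For reference, a minimal sketch of the byte[] work-around described above, using an illustrative wrapper class for the key (a real key class would also need equals() and hashCode()):

    import java.io.IOException;
    import java.math.BigInteger;
    import com.tangosol.io.pof.PofReader;
    import com.tangosol.io.pof.PofWriter;
    import com.tangosol.io.pof.PortableObject;

    public class BigIntegerKey implements PortableObject {
        private BigInteger value;

        public BigIntegerKey() {
        }

        public BigIntegerKey(BigInteger value) {
            this.value = value;
        }

        public void readExternal(PofReader in) throws IOException {
            // rebuild the BigInteger from its two's-complement byte form
            value = new BigInteger(in.readByteArray(0));
        }

        public void writeExternal(PofWriter out) throws IOException {
            // sidesteps the 128-bit limit of writeBigInteger
            out.writeByteArray(0, value.toByteArray());
        }
    }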

  • RELEVANCY SCORE 2.78

    DB:2.78:Can I Use Java Serialization As Well As Pof Serialization In Same Class 9x


    Hi,

    Can I use Java serialization and POF serialization in one class?

    And

    In my application, an object is already serialized using normal Java serialization and stored in the DB. I want to deserialize it using POF serialization - how can I do that?

    And

    How can I implement automatic POF serialization? If it is possible, how can I implement it in the scenario below?

    Scenario:

    I have 2 classes (Class A and Class B), both using the normal Java serialization process. A Class B object is added to a collection variable (such as a list) in Class A. If I implement automatic POF serialization in Class A, will it serialize both classes?

    Please help. I am using Coherence 3.7 and WebLogic 10.3.5.

    DB:2.78:Can I Use Java Serialization As Well As Pof Serialization In Same Class 9x

    Hi,

    Can I use Java serialization and POF serialization in one class?

    This is possible, however it would require a custom serializer. Assuming you want a POF-aware object to serialize a java.io.Serializable, then you could, either within a custom PofSerializer or in your PortableObject implementation, manually serialize your object to a byte[]. Once you have a byte[] you can call either PofWriter.writeByteArray or PofWriter.writeBinary (if you wish to use our Binary wrapper). Upon deserialization you would perform the inverse, i.e. PofReader.readByteArray, and manually convert the returned byte[] to its Object form. As an alternative to the "manual" parts you could use ExternalizableHelper.toBinary and ExternalizableHelper.fromBinary, providing appropriate serializers.

    In my application, an object is already serialized using normal Java serialization and stored in the DB; I want to deserialize it using POF serialization - how can I do that?

    This will be a two-step process: deserialize using Java serialization into object form, followed by serializing using POF.

    How can I implement automatic POF serialization, and how can I implement it in the scenario below?

    Recently we have added support for annotations, hence you can annotate your methods, fields or properties (.NET) as opposed to implementing serialization routines; here is the youtube [url http://www.youtube.com/watch?v=hapBGoS-viIfeature=youtube_gdata]screencast. That is as "auto" as we go (OOTB) at the moment. There are a few people that have done work in either generating serialization code or reflection-based serialization, as far as I know.

    I have 2 classes (Class A and Class B), both using normal Java serialization; a Class B object is added to a collection in Class A. If I implement automatic POF serialization in Class A, will it serialize both classes?

    Using POF Annotations you must annotate both classes.

    Thanks,
    Harvey
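
    A rough sketch of the ExternalizableHelper approach described above, with illustrative class and field names:

    import java.io.IOException;
    import java.io.Serializable;
    import com.tangosol.io.DefaultSerializer;
    import com.tangosol.io.Serializer;
    import com.tangosol.io.pof.PofReader;
    import com.tangosol.io.pof.PofWriter;
    import com.tangosol.io.pof.PortableObject;
    import com.tangosol.util.Binary;
    import com.tangosol.util.ExternalizableHelper;

    public class LegacyHolder implements PortableObject {
        // plain java.io.Serializable payload produced by the legacy code
        private Serializable payload;

        private static final Serializer JAVA = new DefaultSerializer();

        public void readExternal(PofReader in) throws IOException {
            Binary bin = in.readBinary(0);
            payload = (Serializable) ExternalizableHelper.fromBinary(bin, JAVA);
        }

        public void writeExternal(PofWriter out) throws IOException {
            // embed the java-serialized form as an opaque binary inside the POF stream
            out.writeBinary(0, ExternalizableHelper.toBinary(payload, JAVA));
        }
    }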

  • RELEVANCY SCORE 2.77

    DB:2.77:Using Pof In C++ Client xk


    I never used C++ extend client before.

    One of our developers is going to use the C++ extend client and would like to use POF serialization.
    I told him that I need a matching Java class in order to store the data on the storage nodes, but he told me that we don't need one if using POF.

    Is he right??

    DB:2.77:Using Pof In C++ Client xk

    This thread title is getting misleading, but what I am about to write applies to both C++ and .Net.

    When using the PofExtractor, the message sent to the backend includes the PofExtractor and its context. So it is not the data that makes the trip back and forth between the nodes, but the command. The PofExtractor gets executed on the server side.

    How is that possible? The PofExtractor is not a custom class, but a Coherence class. As such, even if you don't provide a Java class for it, the cache already has it...

    So, the filtering occurs on the server side. The entries returned to the client side are the filtered entries.

    HTH

    Serge

  • RELEVANCY SCORE 2.77

    DB:2.77:Serialization Error While Accessing An Hr Application zx



    Hi All,

    We have a producer portal where we have configured all the HR ESS / MSS applications, and a consumer portal through which the end users access the application.

    From logs of our producer portal, we found the error given below. I tried searching this error message in SDN, but found no relevant explanation for the error. Kindly let me know the reason for this error and the resolution.

    Error:

    Serialization failed for object: pcd:portal_content/com.XX.fl_XX_addl_content/com.XX.fl_hr/com.XX.fl_ess/com.XX.r_proxy_main/fl_ecm_employee_services/fl_ecm_manager_services/fl_ecm_proxy_line_mgr/com.XX.i_overview. Exception: com.sap.portal.ivs.global.jndibridge.serializers.SerializationFailedException

    Regards,

    Poojith MV

    DB:2.77:Serialization Error While Accessing An Hr Application zx


    Hi All,

    We have a producer portal where we have configured all the HR ESS / MSS applications and a producer portal which consumes thru which the end users access the application.

    From logs of our producer portal, we found the error given below. I tried searching this error message in SDN, but found no relevant explanation for the error. Kindly let me know the reason for this error and the resolution.

    Error:

    Serialization failed for object: pcd:portal_content/com.XX.fl_XX_addl_content/com.XX.fl_hr/com.XX.fl_ess/com.XX.r_proxy_main/fl_ecm_employee_services/fl_ecm_manager_services/fl_ecm_proxy_line_mgr/com.XX.i_overview. Exception: com.sap.portal.ivs.global.jndibridge.serializers.SerializationFailedException

    Regards,

    Poojith MV

  • RELEVANCY SCORE 2.76

    DB:2.76:Pof Serialization Language Interoperability (Java - .Net) jc


    We have been seeing an issue with serializing and deserializing Pof objects between languages.

    Without references-enabled, we can serialize and deserialize in both directions

    With references-enabled set to true, we can serialize in .NET and deserialize in Java, but not the other way round. This means that objects serialized in .NET and updated with an entry processor cannot be deserialized back in .NET.

    Luckily we don't need to have references-enabled at the moment, but this may change in the future...

    Has anyone seen this issue before? is it a known bug?

    DB:2.76:Pof Serialization Language Interoperability (Java - .Net) jc

    Hi Rob,

    That will be great.

    Thanks,
    Luk

    Edited by: lsho on Jun 26, 2012 7:57 AM

  • RELEVANCY SCORE 2.76

    DB:2.76:Pof Serialization Testing c7


    hi

    I would like to write an integration test for POF serialization.

    Is it possible to make Coherence serialize all objects I put into a named cache?
    Or do I need to start two nodes (in two classloaders) in my test?

    DB:2.76:Pof Serialization Testing c7

    If your test case only wants to test the POF serialization, you don't really need to start up a cluster/node at all.

    Just create a ConfigurablePofContext with your POF config XML, then you can perform POF serialization/deserialization with it.
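
    For example, a minimal round-trip helper along those lines (the config file name and sample value are illustrative):

    import com.tangosol.io.pof.ConfigurablePofContext;
    import com.tangosol.util.Binary;
    import com.tangosol.util.ExternalizableHelper;

    public class PofRoundTrip {
        // serializes and deserializes a value through the given POF config, no cluster needed
        public static Object roundTrip(Object value, String pofConfig) {
            ConfigurablePofContext ctx = new ConfigurablePofContext(pofConfig);
            Binary bin = ExternalizableHelper.toBinary(value, ctx);
            return ExternalizableHelper.fromBinary(bin, ctx);
        }

        public static void main(String[] args) {
            // pass an instance of one of your registered user types in a real test
            Object copy = roundTrip("hello", "my-pof-config.xml");
            System.out.println(copy);
        }
    }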

  • RELEVANCY SCORE 2.74

    DB:2.74:Oracle Soa Suite 11g And Push Replication Memory Leak Problem Between Soa Clusters jz


    Hi all,I wonder if someone could help. I'm relatively new to Coherence Incubator so this might be a simple configuration problem. We are trying to build Incubator Push Replication solution with Active-Active topology to enable data replication between two independent SOA Suite 11g clusters. Basicly we have 2 SOA clusters which each have 2 SOA servers (4 all together). I have setup also Coherence Incubator Cache-servers to same nodes where these SOA servers are running and we use Extend Client from each SOA server side to put/get our custom data to/from the Cache-servers. This is the same solution which is described in the Oracle support article "Coherence and SOA Suite Integration Recommendations" (Doc ID 1557370.1 - https://support.oracle.com/epmos/faces/DocumentDisplay?id=1557370.1). Details described in the chapter "Extending the existing SOA cluster".Idea is that each of our 4 SOA servers can update the cache by updating data on a single cache-server (with Coherence Incubator libraries) and the cache-server will then update the changes (with Push Replication) to other 3 cache-servers. This way all 4 cache servers have the latest changes which SOA servers can then use. I did get this get this solution working and have verified that replication of data to each cache server is working. Also our SOA application on each SOA server is able to get the data from cache servers without delays.Problem appeared when I left the cache-servers running for while as pretty soon all of them encountered OutOfMemory-situation and were shutdown. After examining cache-server heap dumps with Eclipse Memory Analyser Tool (MAT) it seemed that there was a single "good" cache object in the heap and several others with the almost same byte size with the exception that all of them seemed to have the cache key in front of the data. If almost seemed like these objects were objects to be replicated/queued for replication or something like that. 
I believe these objects are the reason why the OOM-situation happens but don't know if this is configuration issue or something else.SW details:SOA servers: - Weblogic server 10.3.6.0 - SOA Suite 11.1.1.6.0 - Coherence 3.7.1.12 - Coherence Incubator 11.2.1 libraries (needed for POF serialization)Cache-servers: - Coherence 3.7.1.12 - Coherence Incubator 11.2.1 libraries (needed for POF serialization)I'm suspecting that this error could be related to this:2014-05-09 14:33:41.506/18649.734 Oracle Coherence GE 3.7.1.12 Warning (thread=EventChannelController:Thread-27, member=1): Failed while attempting to start EventChannelController.Identifier{symbolicName=test-node4 CACHESTORE_SOA, externalName=Site3:test_cache:CACHESTORE_SOA:test-node4 CACHESTORE_SOA} Class:com.oracle.coherence.patterns.eventdistribution.distributors.AbstractEventChannelController Method:onStart2014-05-09 14:33:41.506/18649.734 Oracle Coherence GE 3.7.1.12 Info (thread=EventChannelController:Thread-27, member=1): EventChannel Exception was as follows Class:com.oracle.coherence.patterns.eventdistribution.distributors.AbstractEventChannelController Method:onStartjava.lang.IllegalArgumentException: Missing scheme for service: "test-node4" at com.tangosol.net.DefaultConfigurableCacheFactory.findServiceScheme(DefaultConfigurableCacheFactory.java:767) at com.tangosol.net.DefaultConfigurableCacheFactory.ensureService(DefaultConfigurableCacheFactory.java:337) at com.oracle.coherence.common.resourcing.InvocationServiceSupervisedResourceProvider.ensureResource(InvocationServiceSupervisedResourceProvider.java:65)This error comes quite often to each cache-server log and the thing is that it seems that the cache server refers to itself (error comes from test-node4 cache-server so seems like test-node4 is trying to replicate cache data changes to itself)? This also makes me think that there might be something wrong with the configuration, but cache-server config (test-node4) for cache "CACHESTORE_SOA" does not define distribution channel to itself. 
cache-mapping cache-nameCACHESTORE_SOA/cache-name scheme-nameCACHESTORE_SOA-cachestore/scheme-name event:distributor event:distributor-name{cache-name}/event:distributor-name event:distributor-external-name{site-name}-{cluster-name}-{cache-name}/event:distributor-external-name event:distributor-scheme event:coherence-based-distributor-scheme/ /event:distributor-scheme event:distribution-channels event:distribution-channel event:channel-nametest-node1 CACHESTORE_SOA/event:channel-name event:starting-mode system-property="channel.starting.mode"enabled/event:starting-mode event:channel-scheme event:remote-cluster-channel-scheme event:remote-invocation-service-nametest-node1/event:remote-invocation-service-name event:remote-channel-scheme event:local-cache-channel-scheme event:target-cache-nameCACHESTORE_SOA/event:target-cache-name /event:local-cache-channel-scheme /event:remote-channel-scheme /event:remote-cluster-channel-scheme /event:channel-scheme /event:distribution-channel event:distribution-channel event:channel-nametest-node2 CACHESTORE_SOA/event:channel-name event:starting-mode system-property="channel.starting.mode"enabled/event:starting-mode event:channel-scheme event:remote-cluster-channel-scheme event:remote-invocation-service-nametest-node2/event:remote-invocation-service-name event:remote-channel-scheme event:local-cache-channel-scheme event:target-cache-nameCACHESTORE_SOA/event:target-cache-name /event:local-cache-channel-scheme /event:remote-channel-scheme /event:remote-cluster-channel-scheme /event:channel-scheme /event:distribution-channel event:distribution-channel event:channel-nametest-node3 CACHESTORE_SOA/event:channel-name event:starting-mode system-property="channel.starting.mode"enabled/event:starting-mode event:channel-scheme event:remote-cluster-channel-scheme event:remote-invocation-service-nametest-node3/event:remote-invocation-service-name event:remote-channel-scheme event:local-cache-channel-scheme event:target-cache-nameCACHESTORE_SOA/event:target-cache-name /event:local-cache-channel-scheme /event:remote-channel-scheme /event:remote-cluster-channel-scheme /event:channel-scheme /event:distribution-channel /event:distribution-channels /event:distributor /cache-mappingPlease let me know if anyone has any ideas what could be the reason for the OOM-situation.

    DB:2.74:Oracle Soa Suite 11g And Push Replication Memory Leak Problem Between Soa Clusters jz


  • RELEVANCY SCORE 2.74

    DB:2.74:Setting Up Gzip Compression To Optimize Text Throughtput fk


    Hello,

    I struggled to set up my dev cluster with gzip but it's running okay. However, I cannot connect to it from the .NET client. The server complains with "Not in GZIP format".

    What do I miss?

    Thanks,
    Michal

    BTW: The .NET documentation for this is not great (since when does one put documentation in a DTD file?!? see [Network Filters|http://coherence.oracle.com/display/COH35UG/Network+Filters#]...).

    On client I have

    app.config

    <coherence>
    <coherence-config>Config\coherence.xml</coherence-config>
    <cache-config>Config\cache-config.xml</cache-config>
    <pof-config>Config\pof-config.xml</pof-config>
    </coherence>

    cache-config.xml

    <cache-config xmlns="http://schemas.tangosol.com/cache">
    <caching-scheme-mapping>
    <cache-mapping>
    <cache-name>dist-contact-cache</cache-name>
    <scheme-name>extend-direct</scheme-name>
    </cache-mapping>
    </caching-scheme-mapping>
    <caching-schemes>
    <remote-cache-scheme>
    <scheme-name>extend-direct</scheme-name>
    <service-name>ExtendTcpCacheService</service-name>
    <initiator-config>
    <tcp-initiator>
    <remote-addresses>
    <socket-address>
    <address>[name of my machine]</address>
    <port>9099</port>
    </socket-address>
    </remote-addresses>
    </tcp-initiator>
    <outgoing-message-handler>
    <request-timeout>30s</request-timeout>
    </outgoing-message-handler>
    <use-filters><filter-name>gzip</filter-name></use-filters>
    </initiator-config>
    </remote-cache-scheme>
    </caching-schemes>
    </cache-config>

    msg from the client:

    Oracle Coherence for .NET Version 3.7.0.0 Build 23256
    RTC Release Build
    Copyright (c) 2000, 2011, Oracle and/or its affiliates. All rights reserved.

    2011-08-19 13:39:26.714 D5 (thread=System.Threading.Thread): Loaded operational configuration from "FileResource(Uri = file://Config\coherence.xml, AbsolutePath = O:\bin\Debug\Config\coherence.xml)"
    2011-08-19 13:39:26.714 D5 (thread=System.Threading.Thread): Loaded cache configuration from "FileResource(Uri = file://Config\cache-config.xml, AbsolutePath = O:\bin\Debug\Config\cache-config.xml)"
    2011-08-19 13:39:26.855 D5 (thread=ExtendTcpCacheService:TcpInitiator): Started: TcpInitiator{Name=ExtendTcpCacheService:TcpInitiator, State=(Started), Codec=Tangosol.Net.Messaging.Impl.Codec, PingInterval=0, PingTimeout=30000, RequestTimeout=30000, ConnectTimeout=30000, RemoteAddresses=[10.166.110.67:9099], KeepAliveEnabled=True, TcpDelayEnabled=False, ReceiveBufferSize=0, SendBufferSize=0, LingerTimeout=-1}
    2011-08-19 13:39:26.870 D5 (thread=System.Threading.Thread): Connecting Socket to 10.166.110.67:9099
    2011-08-19 13:39:26.870 Info (thread=System.Threading.Thread): Connected TcpClient to 10.166.110.67:9099
    2011-08-19 13:39:26.917 D5 (thread=ExtendTcpCacheService:TcpInitiator): Loaded POF configuration from "FileResource(Uri = file://Config\pof-config.xml, AbsolutePath = O:\bin\Debug\Config\pof-config.xml)"
    2011-08-19 13:39:26.933 D5 (thread=ExtendTcpCacheService:TcpInitiator): Loaded included POF configuration from "EmbeddedResource(Uri = assembly://Coherence/Tangosol.Config/coherence-pof-config.xml, AbsolutePath = assembly://Coherence/Tangosol.Config/coherence-pof-config.xml)"
    2011-08-19 13:39:27.011 Info (thread=System.Threading.Thread): Error establishing a connection with 10.166.110.67:9099: Tangosol.Net.Messaging.ConnectionException: TcpConnection(Id=, Open=True, LocalAddress=0.0.0.0:3551, RemoteAddress=10.166.110.67:9099)
    2011-08-19 13:39:27.011 Error (thread=System.Threading.Thread): Error while starting service "ExtendTcpCacheService": Tangosol.Net.Messaging.ConnectionException: could not establish a connection to one of the following addresses: [10.166.110.67:9099]; make sure the "remote-addresses" configuration element contains an address and port of a running TcpAcceptor
    2011-08-19 13:39:27.026 D5 (thread=ExtendTcpCacheService:TcpInitiator): Stopped: TcpInitiator{Name=ExtendTcpCacheService:TcpInitiator, State=(Stopped), Codec=Tangosol.Net.Messaging.Impl.Codec, PingInterval=0, PingTimeout=30000, RequestTimeout=30000, ConnectTimeout=30000, RemoteAddresses=[10.166.110.67:9099], KeepAliveEnabled=True, TcpDelayEnabled=False, ReceiveBufferSize=0, SendBufferSize=0, LingerTimeout=-1}

    Unhandled Exception: Tangosol.Net.Messaging.ConnectionException: could not establish a connection to one of the following addresses: [10.166.110.67:9099]; make sure the "remote-addresses" configuration element contains an address and port of a running TcpAcceptor

    My server config is

    config\tangosol-coherence-override.xml

    ?xml version="1.0"?
    !DOCTYPE coherence SYSTEM "coherence.dtd"
    coherence
    cluster-config
    *filter*
    *filter-namegzip/filter-name*
    *filter-classcom.tangosol.net.CompressionFilter/filter-class*
    *init-params*
    *init-param*
    *param-namestrategy/param-name*
    *param-valuegzip/param-value*
    */init-param*
    *init-param*
    *param-namelevel/param-name*
    *param-valuespeed/param-value*
    */init-param*
    */init-params*
    */filter*
    /cluster-config
    /coherence

    and config/contact-cache-config.xml

    <cache-config>
      <defaults>
        <serializer>pof</serializer>
      </defaults>

      <caching-scheme-mapping>
        <cache-mapping>
          <cache-name>dist-*</cache-name>
          <scheme-name>dist-default</scheme-name>
        </cache-mapping>
        <cache-mapping>
          <cache-name>repl-*</cache-name>
          <scheme-name>repl-default</scheme-name>
        </cache-mapping>
        <cache-mapping>
          <cache-name>aspnet-session-storage</cache-name>
          <scheme-name>aspnet-session-scheme</scheme-name>
        </cache-mapping>
        <cache-mapping>
          <cache-name>aspnet-session-overflow</cache-name>
          <scheme-name>aspnet-session-overflow-scheme</scheme-name>
        </cache-mapping>
      </caching-scheme-mapping>

      <caching-schemes>
        <distributed-scheme>
          <scheme-name>aspnet-session-scheme</scheme-name>
          <scheme-ref>dist-default</scheme-ref>
          <service-name>AspNetSessionCache</service-name>

          <backing-map-scheme>
            <local-scheme>
              <class-name>com.tangosol.net.cache.LocalCache</class-name>
              <listener>
                <class-scheme>
                  <class-name>
                    com.tangosol.net.internal.AspNetSessionStoreProvider$SessionCleanupListener
                  </class-name>
                  <init-params>
                    <init-param>
                      <param-type>com.tangosol.net.BackingMapManagerContext</param-type>
                      <param-value>{manager-context}</param-value>
                    </init-param>
                  </init-params>
                </class-scheme>
              </listener>
            </local-scheme>
          </backing-map-scheme>

          <autostart>true</autostart>
        </distributed-scheme>

        <distributed-scheme>
          <scheme-name>aspnet-session-overflow-scheme</scheme-name>
          <scheme-ref>dist-default</scheme-ref>
          <service-name>AspNetSessionCache</service-name>
          <autostart>true</autostart>
        </distributed-scheme>

        <distributed-scheme>
          <scheme-name>dist-default</scheme-name>

          <backing-map-scheme>
            <local-scheme/>
          </backing-map-scheme>

          <autostart>true</autostart>
        </distributed-scheme>

        <replicated-scheme>
          <scheme-name>repl-default</scheme-name>
          <backing-map-scheme>
            <local-scheme/>
          </backing-map-scheme>

          <autostart>true</autostart>
        </replicated-scheme>

        <proxy-scheme>
          <service-name>ExtendTcpProxyService</service-name>
          <thread-count>5</thread-count>
          <acceptor-config>
            <tcp-acceptor>
              <local-address>
                <address>localhost</address>
                <port>9099</port>
              </local-address>
            </tcp-acceptor>
            <use-filters>
              <filter-name>gzip</filter-name>
            </use-filters>
          </acceptor-config>

          <autostart>true</autostart>
        </proxy-scheme>
      </caching-schemes>
    </cache-config>

    And I get

    2011-08-19 13:39:27.008/7012.815 Oracle Coherence GE 3.7.0.0 Error (thread=Proxy:ExtendTcpProxyService:TcpAcceptor, me
    mber=2): An exception occurred while decoding a Message for Service=Proxy:ExtendTcpProxyService:TcpAcceptor received fro
    m: TcpConnection(Id=null, Open=true, LocalAddress=10.166.110.67:9099, RemoteAddress=10.166.110.54:3551): java.io.IOExcep
    tion: Not in GZIP format
    at java.util.zip.GZIPInputStream.readHeader(GZIPInputStream.java:143)
    at java.util.zip.GZIPInputStream.init(GZIPInputStream.java:58)
    at java.util.zip.GZIPInputStream.init(GZIPInputStream.java:67)
    at com.tangosol.net.CompressionFilter.getInputStream(CompressionFilter.java:57)
    at com.tangosol.coherence.component.util.daemon.queueProcessor.service.Peer.onNotify(Peer.CDB:54)
    at com.tangosol.coherence.component.util.Daemon.run(Daemon.CDB:42)
    at java.lang.Thread.run(Thread.java:662)

    Edited by: 880448 on Aug 19, 2011 10:58 AM

    Edited by: 880448 on Aug 19, 2011 11:07 AM

    DB:2.74:Setting Up Gzip Compression To Optimize Text Throughtput fk

    Heh! Here's what I found in the 3.7 release notes:
    http://download.oracle.com/docs/cd/E18686_01/coh.37/e21505/technotes.htm#BABHFDCB

    Network filters are deprecated and will be desupported. Current encryption filter implementations must be migrated to use SSL. There is no replacement for the compression filter.

  • RELEVANCY SCORE 2.74

    DB:2.74:Storing String Vs. Byte[] ? 1z


    Is it more efficient to store a byte[] in a cache instead of a String? I'm just using the default serialization, no POF, etc.

    Thanks,
    Andrew

    DB:2.74:Storing String Vs. Byte[] ? 1z

    It will be marginally more efficient to store a byte[] (or Binary), because it's easier to read and write. A String is composed of 2-byte characters, but they are converted on-the-fly to UTF-8 format. Basically, for each 2-byte character, a 1-, 2- or 3-byte sequence will be created:

    if (ch >= 0x0001 && ch <= 0x007F)
        {
        // 1-byte format: 0xxx xxxx
        ab[ofb++] = (byte) ch;
        }
    else if (ch <= 0x07FF)
        {
        // 2-byte format: 110x xxxx, 10xx xxxx
        ab[ofb++] = (byte) (0xC0 | ((ch >>> 6) & 0x1F));
        ab[ofb++] = (byte) (0x80 | (ch & 0x3F));
        }
    else
        {
        // 3-byte format: 1110 xxxx, 10xx xxxx, 10xx xxxx
        ab[ofb++] = (byte) (0xE0 | ((ch >>> 12) & 0x0F));
        ab[ofb++] = (byte) (0x80 | ((ch >>> 6) & 0x3F));
        ab[ofb++] = (byte) (0x80 | (ch & 0x3F));
        }

    However, if the data you have to store is textual data, then use a String, and if it's binary data, then use byte[] (or Binary).

    Peace,

    Cameron Purdy | Oracle Coherence
    http://coherence.oracle.com/

  • RELEVANCY SCORE 2.73

    DB:2.73:Pof Compatibility Across Coherence Versions 9p


    Dear all,

    I am very interested in using POF serialization to synchronize two Coherence grids via JMS, both for its performance and because the object-to-bytes serialization cost is already paid when exchanging data between the business logic and the backing map.

    Before going in this direction, I would like to better understand POF characteristics (in addition to the wiki docs*) :

    - Will there be POF compatibility between Coherence 3.5 and forthcoming versions?
    For example, will a POF stream written with Coherence 3.5.3 be understandable by Coherence 3.6? 4.x? Will the reverse also be true? Will a POF stream written by Coherence 4.x be understandable by Coherence 3.5.3?

    - Is there a plan to make POF independent of Coherence? Will it be possible to use POF in a pure weblogic-or-tomcat web app? Will there be a downloadable POF library with the associated licensing?

    Thanks,

    Cyrille

    * http://coherence.oracle.com/display/COH35UG/The+Portable+Object+Format

    DB:2.73:Pof Compatibility Across Coherence Versions 9p

    I continued the discussion regarding POF annotations on the dedicated thread : {thread:id=1082413}.

    Cyrille

  • RELEVANCY SCORE 2.73

    DB:2.73:Using Struct Instead Of A Class For Pof Object Works And Mapping Char Array 9x


    Hi,
    Can't we use a structure as the data container rather than a class, and use a separate implementation for the serialization? If we provide the data items with Coherence-compatible data types and implement the equals, output-stream and hashCode methods, is it possible to use a struct?

    regards,
    Sura

    Edited by: sura on 12-Aug-2011 02:38

    DB:2.73:Using Struct Instead Of A Class For Pof Object Works And Mapping Char Array 9x

    In the Java version, if you don't want to make the common business object implement PortableObject, you can use a PofSerializer instead, so your business object is not tied to any Coherence class.

    I am not sure whether you can do the same thing in C++; you might want to check it out.
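
    To illustrate the external-serializer approach in Java, here is a minimal sketch of a PofSerializer for a hypothetical plain Customer class (the class name, fields and POF indexes are assumptions, not taken from this thread). The business class itself needs no Coherence imports; the serializer is registered against the user type in the POF configuration.

    import com.tangosol.io.pof.PofReader;
    import com.tangosol.io.pof.PofSerializer;
    import com.tangosol.io.pof.PofWriter;
    import java.io.IOException;

    public class CustomerSerializer implements PofSerializer {
        @Override
        public void serialize(PofWriter out, Object o) throws IOException {
            Customer customer = (Customer) o;        // hypothetical plain POJO
            out.writeString(0, customer.getName());
            out.writeInt(1, customer.getId());
            out.writeRemainder(null);                // always terminate the user type
        }

        @Override
        public Object deserialize(PofReader in) throws IOException {
            String name = in.readString(0);
            int    id   = in.readInt(1);
            in.readRemainder();
            return new Customer(name, id);
        }
    }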

  • RELEVANCY SCORE 2.73

    DB:2.73:Performance Issue With Invocablemaphelper.Evaluateentry Deserializing Key ac


    Hi, I'm using POF serialization, and I'm running a filter+aggregator with all POF extractors. Therefore, I would expect performance to be much better because the entries should not need to be deserialized. However, I'm still seeing poor performance. After profiling, I can see that about 50% of the time is spent deserializing the entry's key, and this appears to be executed by Coherence itself, not by anything I wrote. Deserialization of the key is very unexpected, and seems unnecessary. Here's a screenshot of my VisualVM profiling: http://imgur.com/Vpzql

    Does anyone have any idea why that call stack feels the need to deserialize the key?

    Does this sound like something that merits submitting a support request?

    Nevermind!

    DB:2.73:Performance Issue With Invocablemaphelper.Evaluateentry Deserializing Key ac

    Hi rehevkor,

    1. Which exact Coherence version do you use?

    2. Could you please post your code for sending the aggregators?

    Best regards,

    Robert
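
    For reference, the kind of all-POF filter-plus-aggregator call being described looks roughly like the sketch below. This is a hedged example only: it assumes the Coherence 3.6+ PofExtractor constructor, and the cache name, POF indexes and aggregated property are made up, not taken from the original post.

    import com.tangosol.net.CacheFactory;
    import com.tangosol.net.NamedCache;
    import com.tangosol.util.Filter;
    import com.tangosol.util.ValueExtractor;
    import com.tangosol.util.aggregator.DoubleSum;
    import com.tangosol.util.extractor.PofExtractor;
    import com.tangosol.util.filter.EqualsFilter;

    public class PofAggregationSketch {
        public static void main(String[] args) {
            NamedCache cache = CacheFactory.getCache("example-trades");   // assumed cache name

            ValueExtractor region = new PofExtractor(String.class, 1);    // assumed POF index
            ValueExtractor price  = new PofExtractor(Double.class, 2);    // assumed POF index

            // Filter and aggregate entirely through POF extractors, so the cached values
            // themselves should not need to be deserialized on the storage nodes.
            Filter filter = new EqualsFilter(region, "EMEA");
            Double total  = (Double) cache.aggregate(filter, new DoubleSum(price));

            System.out.println("Total: " + total);
            CacheFactory.shutdown();
        }
    }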

  • RELEVANCY SCORE 2.73

    DB:2.73:Abap Dump "Data_Offset_Length_Too_Large" Facing Error While Processing Inbound Idocs 8z



    We are using a batch job to process IDocs with status 64 and 66 (serialization). Everything was working fine, and then suddenly the job threw an ABAP dump "DATA_OFFSET_LENGTH_TOO_LARGE" and was cancelled.

    Kindly help.

    We have been facing the problem for 10 days; the job gets cancelled every day.

    DB:2.73:Abap Dump "Data_Offset_Length_Too_Large" Facing Error While Processing Inbound Idocs 8z



  • RELEVANCY SCORE 2.73

    DB:2.73:Net Pof Client Connection To Cachesetver pj


    I am trying to set up an extend client to connect to a cache server on Linux - Exception -

    2011-10-17 15:41:43.183/4.385 Oracle Coherence GE 3.7.1.0 D5 (thread=Invocation:Management, member=1): Service Management joined the cluster with senior service member 1
    2011-10-17 15:41:43.436/4.638 Oracle Coherence GE 3.7.1.0 Info (thread=DistributedCache, member=1): Loaded POF configuration from "jar:file:/app/titan/coherence3.7.1/cfg/cfg.jar!/custom-types-pof-config.xml"
    2011-10-17 15:41:43.487/4.690 Oracle Coherence GE 3.7.1.0 Info (thread=DistributedCache, member=1): Loaded included POF configuration from "jar:file:/app/titan/coherence3.7.1/lib/coherence.jar!/coherence-pof-config.xml"
    2011-10-17 15:41:43.560/4.762 Oracle Coherence GE 3.7.1.0 D5 (thread=DistributedCache, member=1): Service DistributedCache joined the cluster with senior service member 1
    2011-10-17 15:41:43.718/4.920 Oracle Coherence GE 3.7.1.0 Info (thread=Proxy:TcpProxyService:TcpAcceptor, member=1): TcpAcceptor now listening for connections on 169.35.226.18:9099
    2011-10-17 15:41:43.719/4.922 Oracle Coherence GE 3.7.1.0 D5 (thread=Proxy:TcpProxyService:TcpAcceptor, member=1): Started: TcpAcceptor{Name=Proxy:TcpProxyService:TcpAcceptor, State=(SERVICE_STARTED), ThreadCount=0, Codec=Codec(Format=POF), Serializer=com.tangosol.io.DefaultSerializer, PingInterval=0, PingTimeout=30000, RequestTimeout=30000, SocketProvider=SystemSocketProvider, LocalAddress=[pnl01a-2301/169.35.226.18:9099], SocketOptions{LingerTimeout=0, KeepAliveEnabled=true, TcpDelayEnabled=false}, ListenBacklog=0, BufferPoolIn=BufferPool(BufferSize=2KB, BufferType=DIRECT, Capacity=Unlimited), BufferPoolOut=BufferPool(BufferSize=2KB, BufferType=DIRECT, Capacity=Unlimited)}
    2011-10-17 15:41:43.727/4.929 Oracle Coherence GE 3.7.1.0 D5 (thread=Proxy:TcpProxyService, member=1): Service TcpProxyService joined the cluster with senior service member 1
    2011-10-17 15:41:43.730/4.932 Oracle Coherence GE 3.7.1.0 Info (thread=main, member=1):
    Services
    (
    ClusterService{Name=Cluster, State=(SERVICE_STARTED, STATE_JOINED), Id=0, Version=3.7.1, OldestMemberId=1}
    InvocationService{Name=Management, State=(SERVICE_STARTED), Id=1, Version=3.1, OldestMemberId=1}
    PartitionedCache{Name=DistributedCache, State=(SERVICE_STARTED), LocalStorage=enabled, PartitionCount=257, BackupCount=1, AssignedPartitions=257, BackupPartitions=0}
    ProxyService{Name=TcpProxyService, State=(SERVICE_STARTED), Id=3, Version=3.7, OldestMemberId=1}
    )

    Started DefaultCacheServer...

    2011-10-17 15:42:04.493/25.695 Oracle Coherence GE 3.7.1.0 Error (thread=Proxy:TcpProxyService:TcpAcceptor, member=1): An exception occurred while decoding a Message for Service=Proxy:TcpProxyService:TcpAcceptor received from: TcpConnection(Id=null, Open=true, LocalAddress=169.35.226.18:9099, RemoteAddress=169.37.14.55:3202): java.lang.IllegalStateException: TcpAcceptor{Name=Proxy:TcpProxyService:TcpAcceptor, State=(SERVICE_STARTED), ThreadCount=0, Codec=Codec(Format=POF), Serializer=com.tangosol.io.DefaultSerializer, PingInterval=0, PingTimeout=30000, RequestTimeout=30000, SocketProvider=SystemSocketProvider, LocalAddress=[pnl01a-2301/169.35.226.18:9099], SocketOptions{LingerTimeout=0, KeepAliveEnabled=true, TcpDelayEnabled=false}, ListenBacklog=0, BufferPoolIn=BufferPool(BufferSize=2KB, BufferType=DIRECT, Capacity=Unlimited), BufferPoolOut=BufferPool(BufferSize=2KB, BufferType=DIRECT, Capacity=Unlimited)} has not been configured with a PofContext; this channel cannot decode POF-encoded user types
    at com.tangosol.coherence.component.net.extend.Channel.getPofSerializer(Channel.CDB:25)
    at com.tangosol.io.pof.PofBufferReader.readAsObject(PofBufferReader.java:3308)
    at com.tangosol.io.pof.PofBufferReader.readObject(PofBufferReader.java:2604)
    at com.tangosol.coherence.component.util.daemon.queueProcessor.service.Peer$MessageFactory$OpenConnectionRequest.readExternal(Peer.CDB:7)
    at com.tangosol.coherence.component.util.daemon.queueProcessor.service.peer.acceptor.TcpAcceptor$MessageFactory$OpenConnectionRequest.readExternal(TcpAcceptor.CDB:1)
    at com.tangosol.coherence.component.net.extend.Codec.decode(Codec.CDB:29)
    at com.tangosol.coherence.component.util.daemon.queueProcessor.service.Peer.decodeMessage(Peer.CDB:25)
    at com.tangosol.coherence.component.util.daemon.queueProcessor.service.Peer.onNotify(Peer.CDB:54)
    at com.tangosol.coherence.component.util.Daemon.run(Daemon.CDB:42)
    at java.lang.Thread.run(Thread.java:662)

    Server Config -

    <!DOCTYPE cache-config SYSTEM "cache-config.dtd">
    <cache-config>
      <caching-scheme-mapping>
        <cache-mapping>
          <cache-name>dist-*</cache-name>
          <scheme-name>example-distributed</scheme-name>
          <init-params>
            <init-param>
              <param-name>back-size-limit</param-name>
              <param-value>8MB</param-value>
            </init-param>
          </init-params>
        </cache-mapping>

        <cache-mapping>
          <cache-name>near-*</cache-name>
          <scheme-name>example-near</scheme-name>
          <init-params>
            <init-param>
              <param-name>back-size-limit</param-name>
              <param-value>8MB</param-value>
            </init-param>
          </init-params>
        </cache-mapping>

        <cache-mapping>
          <cache-name>*</cache-name>
          <scheme-name>example-distributed</scheme-name>
        </cache-mapping>
      </caching-scheme-mapping>

      <caching-schemes>
        <!-- Distributed caching scheme. -->
        <distributed-scheme>
          <scheme-name>example-distributed</scheme-name>
          <service-name>DistributedCache</service-name>
          <serializer>
            <class-name>com.tangosol.io.pof.ConfigurablePofContext</class-name>
            <init-params>
              <init-param>
                <param-type>string</param-type>
                <param-value>custom-types-pof-config.xml</param-value>
              </init-param>
            </init-params>
          </serializer>
          <!-- END Required for POF Serialization -->

          <backing-map-scheme>
            <local-scheme>
              <scheme-ref>example-binary-backing-map</scheme-ref>
            </local-scheme>
          </backing-map-scheme>
          <autostart>true</autostart>
        </distributed-scheme>

        <!--
        Proxy Service scheme that allows remote clients to connect to the
        cluster over TCP/IP.
        -->
        <proxy-scheme>
          <scheme-name>example-proxy</scheme-name>
          <service-name>TcpProxyService</service-name>
          <acceptor-config>
            <tcp-acceptor>
              <local-address>
                <address system-property="tangosol.coherence.extend.address">169.35.226.18</address>
                <port system-property="tangosol.coherence.extend.port">9099</port>
              </local-address>
            </tcp-acceptor>
          </acceptor-config>

          <proxy-config>
            <cache-service-proxy>
              <enabled>true</enabled>
            </cache-service-proxy>
            <invocation-service-proxy>
              <enabled>true</enabled>
            </invocation-service-proxy>
          </proxy-config>

          <autostart system-property="tangosol.coherence.extend.enabled">true</autostart>
        </proxy-scheme>
      </caching-schemes>

    </cache-config>

    Server POF Custom Types.xml

    ?xml version="1.0"?
    pof-config xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xmlns="http://xmlns.oracle.com/coherence/coherence-pof-config"
    xsi:schemaLocation="http://xmlns.oracle.com/coherence/coherence-pof-config coherence-pof-config.xsd"
    user-type-list
    !-- coherence POF user types --
    includecoherence-pof-config.xml/include
    !-- include all application POF user types --
    user-type
    type-id1001/type-id
    class-nameexamples.ContactInfo/class-name
    /user-type
    /user-type-list

    allow-interfacestrue/allow-interfaces
    allow-subclassestrue/allow-subclasses
    /pof-config

    Client - pof-config.xml

    <?xml version="1.0"?>
    <pof-config xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
                xmlns="http://xmlns.oracle.com/coherence/coherence-pof-config"
                xsi:schemaLocation="http://xmlns.oracle.com/coherence/coherence-pof-config coherence-pof-config.xsd">
      <user-type-list>
        <!-- coherence POF user types -->
        <include>coherence-pof-config.xml</include>
        <!-- include all application POF user types -->
        <user-type>
          <type-id>1001</type-id>
          <class-name>examples.ContactInfo</class-name>
        </user-type>
      </user-type-list>

      <allow-interfaces>true</allow-interfaces>
      <allow-subclasses>true</allow-subclasses>
    </pof-config>

    Client - cache-config.xml

    ?xml version="1.0"?

    cache-config xmlns="http://schemas.tangosol.com/cache"
    caching-scheme-mapping
    cache-mapping
    cache-namedist-contact-cache/cache-name
    scheme-nameextend-direct/scheme-name
    /cache-mapping
    /caching-scheme-mapping
    caching-schemes
    remote-cache-scheme
    scheme-nameextend-direct/scheme-name
    service-nameExtendTcpCacheService/service-name
    initiator-config
    tcp-initiator
    remote-addresses
    socket-address
    address169.35.226.18/address
    port9099/port
    /socket-address
    /remote-addresses
    /tcp-initiator
    outgoing-message-handler
    request-timeout30s/request-timeout
    /outgoing-message-handler
    /initiator-config
    /remote-cache-scheme
    /caching-schemes
    /cache-config

    I see the pof config is loaded properly.

    Any help will be appreciated.

    DB:2.73:Net Pof Client Connection To Cachesetver pj

    You need to add a serializer to the proxy scheme

    <!--
    Proxy Service scheme that allows remote clients to connect to the
    cluster over TCP/IP.
    -->
    <proxy-scheme>
      <scheme-name>example-proxy</scheme-name>
      <service-name>TcpProxyService</service-name>
      <acceptor-config>
        <serializer>
          <instance>
            <class-name>com.tangosol.io.pof.ConfigurablePofContext</class-name>
            <init-params>
              <init-param>
                <param-value system-property="odc.tangosol.pof.config">custom-types-pof-config.xml</param-value>
                <param-type>String</param-type>
              </init-param>
            </init-params>
          </instance>
        </serializer>

        <tcp-acceptor>
          <local-address>
            <address system-property="tangosol.coherence.extend.address">169.35.226.18</address>
            <port system-property="tangosol.coherence.extend.port">9099</port>
          </local-address>
        </tcp-acceptor>
      </acceptor-config>

      <proxy-config>
        <cache-service-proxy>
          <enabled>true</enabled>
        </cache-service-proxy>
        <invocation-service-proxy>
          <enabled>true</enabled>
        </invocation-service-proxy>
      </proxy-config>

      <autostart system-property="tangosol.coherence.extend.enabled">true</autostart>
    </proxy-scheme>

    I assume this is the same error as your other post.

    JK

  • RELEVANCY SCORE 2.73

    DB:2.73:Entry Processor Performance And Serialization 8c


    Our application invokes an entry processor from a .Net extend client. We are seeing very poor performance the first few times the processor is invoked (typically 60ms vs 2ms). I was expecting this to be due to the overhead of the initial TCP connection but this doesn't appear to be the case. Here is a typical set of timings from the beginning of invoking the EP on the client:

    5ms: Object starts being deserialized on proxy
    7ms: Deserialization is finished, and serialization begins
    9ms: Serialization is complete.
    55ms: Deserialization begins on the storage node
    58ms: Deserialization complete on storage node
    58.5: Entry processor invocation complete
    60ms: Serialization of result is complete

    My questions are:
    - What is happening between the end of serialization of the object by the proxy and deserialization starting on the storage node?
    - Given that the cache is POF enabled, why is the proxy deserializing and reserializing the entry processor?

    Version is 3.7.1.6

    DB:2.73:Entry Processor Performance And Serialization 8c

    It looks like you are using a cache-service-proxy.

    Are you intercepting the invoke() operation and doing something that forces the proxy node to deserialize the entry processor?

  • RELEVANCY SCORE 2.73

    DB:2.73:Session Serialization In Bo 4.1 1c



    Dear All,

    Can you guide me on implementing session serialization in an OpenDocument URL?

    I'm getting the below mentioned error for OpenDocument:

    HTTP Status 500 - java.lang.RuntimeException: org.apache.jasper.JasperException: java.lang.NullPointerException: while trying to invoke the method com.crystaldecisions.sdk.framework.IEnterpriseSession.getClusterName() of a null object loaded from local variable 'session'

    I have referred to the below mentioned URL for OpenDocument, but I didn't get it to work:

    https://help.sap.com/businessobject/product_guides/sbo41/en/sbo41_opendocument_en.pdf

    Please find the attachment for the exact error message, to help fix the issue.

    The platform installed is the Java version. In which path do I need to change the session token? Is there any SDK to install for session serialization?

    Thanks,

    Sella Perumal P

    DB:2.73:Session Serialization In Bo 4.1 1c


    Dear All,

    I have fixed the above mentioned Issue...Thanks for all the replies...

  • RELEVANCY SCORE 2.72

    DB:2.72:Getting "Invalid Type: 169" Errors When Using Pof With Push Replication mz


    I'm trying to get Push Replication - latest version - running on Coherence 3.6.1. I can get it working fine if I don't use POF with my objects, but when trying to use POF format for my objects I get this:

    2011-02-11 13:06:00.993/2.297 Oracle Coherence GE 3.6.1.1 D5 (thread=Invocation:Management, member=1): Service Management joined the cluster with senior service member 1
    2011-02-11 13:06:01.149/2.453 Oracle Coherence GE 3.6.1.1 Info (thread=DistributedCache:DistributedCacheForSequenceGenerators, member=1): Loaded POF configuration from "file:/C:/wsgpc/GlobalPositionsCache/resource/coherence/pof-config.xml"
    2011-02-11 13:06:01.149/2.453 Oracle Coherence GE 3.6.1.1 Info (thread=DistributedCache:DistributedCacheForSequenceGenerators, member=1): Loaded included POF configuration from "jar:file:/C:/coherence3.6/coherence/lib/coherence.jar!/coherence-pof-config.xml"
    2011-02-11 13:06:01.149/2.453 Oracle Coherence GE 3.6.1.1 Info (thread=DistributedCache:DistributedCacheForSequenceGenerators, member=1): Loaded included POF configuration from "jar:file:/C:/coherence3.6-pushreplication/coherence-3.6-common-1.7.3.20019.jar!/coherence-common-pof-config.xml"
    2011-02-11 13:06:01.165/2.469 Oracle Coherence GE 3.6.1.1 Info (thread=DistributedCache:DistributedCacheForSequenceGenerators, member=1): Loaded included POF configuration from "jar:file:/C:/coherence3.6-pushreplication/coherence-3.6-messagingpattern-2.7.4.21016.jar!/coherence-messagingpattern-pof-config.xml"
    2011-02-11 13:06:01.165/2.469 Oracle Coherence GE 3.6.1.1 Info (thread=DistributedCache:DistributedCacheForSequenceGenerators, member=1): Loaded included POF configuration from "jar:file:/C:/coherence3.6-pushreplication/coherence-3.6-pushreplicationpattern-3.0.3.20019.jar!/coherence-pushreplicationpattern-pof-config.xml"
    2011-02-11 13:06:01.243/2.547 Oracle Coherence GE 3.6.1.1 D5 (thread=DistributedCache:DistributedCacheForSequenceGenerators, member=1): Service DistributedCacheForSequenceGenerators joined the cluster with senior service member 1
    2011-02-11 13:06:01.258/2.562 Oracle Coherence GE 3.6.1.1 D5 (thread=DistributedCache:DistributedCacheForLiveObjects, member=1): Service DistributedCacheForLiveObjects joined the cluster with senior service member 1
    2011-02-11 13:06:01.274/2.578 Oracle Coherence GE 3.6.1.1 D5 (thread=DistributedCache:DistributedCacheForSubscriptions, member=1): Service DistributedCacheForSubscriptions joined the cluster with senior service member 1
    2011-02-11 13:06:01.290/2.594 Oracle Coherence GE 3.6.1.1 D5 (thread=DistributedCache:DistributedCacheForMessages, member=1): Service DistributedCacheForMessages joined the cluster with senior service member 1
    2011-02-11 13:06:01.305/2.609 Oracle Coherence GE 3.6.1.1 D5 (thread=DistributedCache:DistributedCacheForDestinations, member=1): Service DistributedCacheForDestinations joined the cluster with senior service member 1
    2011-02-11 13:06:01.305/2.609 Oracle Coherence GE 3.6.1.1 D5 (thread=DistributedCache:DistributedCacheWithPublishingCacheStore, member=1): Service DistributedCacheWithPublishingCacheStore joined the cluster with senior service member 1
    2011-02-11 13:06:01.321/2.625 Oracle Coherence GE 3.6.1.1 D5 (thread=DistributedCache, member=1): Service DistributedCache joined the cluster with senior service member 1
    2011-02-11 13:06:01.461/2.765 Oracle Coherence GE 3.6.1.1 Info (thread=Proxy:ExtendTcpProxyService:TcpAcceptor, member=1): TcpAcceptor now listening for connections on 166.15.224.91:20002
    2011-02-11 13:06:01.461/2.765 Oracle Coherence GE 3.6.1.1 D5 (thread=Proxy:ExtendTcpProxyService:TcpAcceptor, member=1): Started: TcpAcceptor{Name=Proxy:ExtendTcpProxyService:TcpAcceptor, State=(SERVICE_STARTED), ThreadCount=0, Codec=Codec(Format=POF), Serializer=com.tangosol.io.DefaultSerializer, PingInterval=0, PingTimeout=0, RequestTimeout=0, SocketProvider=SystemSocketProvider, LocalAddress=[/166.15.224.91:20002], SocketOptions{LingerTimeout=0, KeepAliveEnabled=true, TcpDelayEnabled=false}, ListenBacklog=0, BufferPoolIn=BufferPool(BufferSize=2KB, BufferType=DIRECT, Capacity=Unlimited), BufferPoolOut=BufferPool(BufferSize=2KB, BufferType=DIRECT, Capacity=Unlimited)}
    2011-02-11 13:06:01.461/2.765 Oracle Coherence GE 3.6.1.1 D5 (thread=Proxy:ExtendTcpProxyService, member=1): Service ExtendTcpProxyService joined the cluster with senior service member 1
    2011-02-11 13:06:01.461/2.765 Oracle Coherence GE 3.6.1.1 Info (thread=main, member=1):
    Services
    (
    ClusterService{Name=Cluster, State=(SERVICE_STARTED, STATE_JOINED), Id=0, Version=3.6, OldestMemberId=1}
    InvocationService{Name=Management, State=(SERVICE_STARTED), Id=1, Version=3.1, OldestMemberId=1}
    PartitionedCache{Name=DistributedCacheForSequenceGenerators, State=(SERVICE_STARTED), LocalStorage=enabled, PartitionCount=257, BackupCount=1, AssignedPartitions=257, BackupPartitions=0}
    PartitionedCache{Name=DistributedCacheForLiveObjects, State=(SERVICE_STARTED), LocalStorage=enabled, PartitionCount=257, BackupCount=1, AssignedPartitions=257, BackupPartitions=0}
    PartitionedCache{Name=DistributedCacheForSubscriptions, State=(SERVICE_STARTED), LocalStorage=enabled, PartitionCount=257, BackupCount=1, AssignedPartitions=257, BackupPartitions=0}
    PartitionedCache{Name=DistributedCacheForMessages, State=(SERVICE_STARTED), LocalStorage=enabled, PartitionCount=257, BackupCount=1, AssignedPartitions=257, BackupPartitions=0}
    PartitionedCache{Name=DistributedCacheForDestinations, State=(SERVICE_STARTED), LocalStorage=enabled, PartitionCount=257, BackupCount=1, AssignedPartitions=257, BackupPartitions=0}
    PartitionedCache{Name=DistributedCacheWithPublishingCacheStore, State=(SERVICE_STARTED), LocalStorage=enabled, PartitionCount=257, BackupCount=1, AssignedPartitions=257, BackupPartitions=0}
    PartitionedCache{Name=DistributedCache, State=(SERVICE_STARTED), LocalStorage=enabled, PartitionCount=257, BackupCount=1, AssignedPartitions=257, BackupPartitions=0}
    ProxyService{Name=ExtendTcpProxyService, State=(SERVICE_STARTED), Id=9, Version=3.2, OldestMemberId=1}
    )

    Started DefaultCacheServer...

    2011-02-11 13:08:27.894/149.198 Oracle Coherence GE 3.6.1.1 Error (thread=Proxy:ExtendTcpProxyService:TcpAcceptor, member=1): Failed to publish EntryOperation{siteName=csfb.cs-group.com, clusterName=SPTestCluster, cacheName=source-cache, operation=Insert, publishableEntry=PublishableEntry{key=Binary(length=32, value=0x15A90F00004E07424F4F4B303038014E08494E535430393834024E0345535040), value=Binary(length=147, value=0x1281A30115AA0F0000A90F00004E07424F4F4B303038014E08494E535430393834024E03455350400248ADEEF99607060348858197BF22060448B4D8E9BE02060548A0D2CDC70E060648B0E9A2C4030607488DBCD6E50D060848B18FC1882006094E03303038402B155B014E0524737263244E1F637366622E63732D67726F75702E636F6D2D535054657374436C7573746572), originalValue=Binary(length=0, value=0x)}} to Cache passive-cache because of
    (Wrapped) java.io.StreamCorruptedException: invalid type: 169 Class:com.oracle.coherence.patterns.pushreplication.publishers.cache.AbstractCachePublisher
    2011-02-11 13:08:27.894/149.198 Oracle Coherence GE 3.6.1.1 D5 (thread=Proxy:ExtendTcpProxyService:TcpAcceptor, member=1): An exception occurred while processing a InvocationRequest for Service=Proxy:ExtendTcpProxyService:TcpAcceptor: (Wrapped: Failed to publish a batch with the publisher [Active Publisher] on cache [source-cache]) java.lang.IllegalStateException: Attempted to publish to cache passive-cache
    at com.tangosol.util.Base.ensureRuntimeException(Base.java:293)
    at com.oracle.coherence.patterns.pushreplication.publishers.RemoteClusterPublisher$RemotePublishingAgent.run(RemoteClusterPublisher.java:348)
    at com.tangosol.coherence.component.net.extend.proxy.serviceProxy.InvocationServiceProxy.query(InvocationServiceProxy.CDB:6)
    at com.tangosol.coherence.component.net.extend.messageFactory.InvocationServiceFactory$InvocationRequest.onRun(InvocationServiceFactory.CDB:12)
    at com.tangosol.coherence.component.net.extend.message.Request.run(Request.CDB:4)
    at com.tangosol.coherence.component.net.extend.proxy.serviceProxy.InvocationServiceProxy.onMessage(InvocationServiceProxy.CDB:9)
    at com.tangosol.coherence.component.net.extend.Channel.execute(Channel.CDB:39)
    at com.tangosol.coherence.component.net.extend.Channel.receive(Channel.CDB:26)
    at com.tangosol.coherence.component.util.daemon.queueProcessor.service.Peer.onNotify(Peer.CDB:103)
    at com.tangosol.coherence.component.util.Daemon.run(Daemon.CDB:42)
    at java.lang.Thread.run(Thread.java:662)
    Caused by: java.lang.IllegalStateException: Attempted to publish to cache passive-cache
    at com.oracle.coherence.patterns.pushreplication.publishers.cache.AbstractCachePublisher.publishBatch(AbstractCachePublisher.java:163)
    at com.oracle.coherence.patterns.pushreplication.publishers.RemoteClusterPublisher$RemotePublishingAgent.run(RemoteClusterPublisher.java:343)
    ... 9 more
    Caused by: (Wrapped) java.io.StreamCorruptedException: invalid type: 169
    at com.tangosol.util.ExternalizableHelper.fromBinary(ExternalizableHelper.java:265)
    at com.tangosol.coherence.component.util.daemon.queueProcessor.service.grid.PartitionedService$ConverterKeyToBinary.convert(PartitionedService.CDB:16)
    at com.tangosol.util.ConverterCollections$ConverterInvocableMap.invoke(ConverterCollections.java:2156)
    at com.tangosol.util.ConverterCollections$ConverterNamedCache.invoke(ConverterCollections.java:2622)
    at com.tangosol.coherence.component.util.daemon.queueProcessor.service.grid.partitionedService.PartitionedCache$ViewMap.invoke(PartitionedCache.CDB:11)
    at com.tangosol.coherence.component.util.SafeNamedCache.invoke(SafeNamedCache.CDB:1)
    at com.oracle.coherence.patterns.pushreplication.publishers.cache.AbstractCachePublisher.publishBatch(AbstractCachePublisher.java:142)
    ... 10 more
    Caused by: java.io.StreamCorruptedException: invalid type: 169
    at com.tangosol.util.ExternalizableHelper.readObjectInternal(ExternalizableHelper.java:2265)
    at com.tangosol.util.ExternalizableHelper.readObject(ExternalizableHelper.java:2253)
    at com.tangosol.io.DefaultSerializer.deserialize(DefaultSerializer.java:74)
    at com.tangosol.util.ExternalizableHelper.deserializeInternal(ExternalizableHelper.java:2703)
    at com.tangosol.util.ExternalizableHelper.fromBinary(ExternalizableHelper.java:261)
    ... 16 more

    2011-02-11 13:08:37.925/159.229 Oracle Coherence GE 3.6.1.1 Error (thread=Proxy:ExtendTcpProxyService:TcpAcceptor, member=1): Failed to publish EntryOperation{siteName=csfb.cs-group.com, clusterName=SPTestCluster, cacheName=source-cache, operation=Insert, publishableEntry=PublishableEntry{key=Binary(length=32, value=0x15A90F00004E07424F4F4B303038014E08494E535430393834024E0345535040), value=Binary(length=147, value=0x1281A30115AA0F0000A90F00004E07424F4F4B303038014E08494E535430393834024E03455350400248ADEEF99607060348858197BF22060448B4D8E9BE02060548A0D2CDC70E060648B0E9A2C4030607488DBCD6E50D060848B18FC1882006094E03303038402B155B014E0524737263244E1F637366622E63732D67726F75702E636F6D2D535054657374436C7573746572), originalValue=Binary(length=0, value=0x)}} to Cache passive-cache because of
    (Wrapped) java.io.StreamCorruptedException: invalid type: 169 Class:com.oracle.coherence.patterns.pushreplication.publishers.cache.AbstractCachePublisher
    2011-02-11 13:08:37.925/159.229 Oracle Coherence GE 3.6.1.1 D5 (thread=Proxy:ExtendTcpProxyService:TcpAcceptor, member=1): An exception occurred while processing a InvocationRequest for Service=Proxy:ExtendTcpProxyService:TcpAcceptor: (Wrapped: Failed to publish a batch with the publisher [Active Publisher] on cache [source-cache]) java.lang.IllegalStateException: Attempted to publish to cache passive-cache
    at com.tangosol.util.Base.ensureRuntimeException(Base.java:293)
    at com.oracle.coherence.patterns.pushreplication.publishers.RemoteClusterPublisher$RemotePublishingAgent.run(RemoteClusterPublisher.java:348)
    at com.tangosol.coherence.component.net.extend.proxy.serviceProxy.InvocationServiceProxy.query(InvocationServiceProxy.CDB:6)
    at com.tangosol.coherence.component.net.extend.messageFactory.InvocationServiceFactory$InvocationRequest.onRun(InvocationServiceFactory.CDB:12)
    at com.tangosol.coherence.component.net.extend.message.Request.run(Request.CDB:4)
    at com.tangosol.coherence.component.net.extend.proxy.serviceProxy.InvocationServiceProxy.onMessage(InvocationServiceProxy.CDB:9)
    at com.tangosol.coherence.component.net.extend.Channel.execute(Channel.CDB:39)
    at com.tangosol.coherence.component.net.extend.Channel.receive(Channel.CDB:26)
    at com.tangosol.coherence.component.util.daemon.queueProcessor.service.Peer.onNotify(Peer.CDB:103)
    at com.tangosol.coherence.component.util.Daemon.run(Daemon.CDB:42)
    at java.lang.Thread.run(Thread.java:662)
    Caused by: java.lang.IllegalStateException: Attempted to publish to cache passive-cache
    at com.oracle.coherence.patterns.pushreplication.publishers.cache.AbstractCachePublisher.publishBatch(AbstractCachePublisher.java:163)
    at com.oracle.coherence.patterns.pushreplication.publishers.RemoteClusterPublisher$RemotePublishingAgent.run(RemoteClusterPublisher.java:343)
    ... 9 more
    Caused by: (Wrapped) java.io.StreamCorruptedException: invalid type: 169
    at com.tangosol.util.ExternalizableHelper.fromBinary(ExternalizableHelper.java:265)
    at com.tangosol.coherence.component.util.daemon.queueProcessor.service.grid.PartitionedService$ConverterKeyToBinary.convert(PartitionedService.CDB:16)
    at com.tangosol.util.ConverterCollections$ConverterInvocableMap.invoke(ConverterCollections.java:2156)
    at com.tangosol.util.ConverterCollections$ConverterNamedCache.invoke(ConverterCollections.java:2622)
    at com.tangosol.coherence.component.util.daemon.queueProcessor.service.grid.partitionedService.PartitionedCache$ViewMap.invoke(PartitionedCache.CDB:11)
    at com.tangosol.coherence.component.util.SafeNamedCache.invoke(SafeNamedCache.CDB:1)
    at com.oracle.coherence.patterns.pushreplication.publishers.cache.AbstractCachePublisher.publishBatch(AbstractCachePublisher.java:142)
    ... 10 more
    Caused by: java.io.StreamCorruptedException: invalid type: 169
    at com.tangosol.util.ExternalizableHelper.readObjectInternal(ExternalizableHelper.java:2265)
    at com.tangosol.util.ExternalizableHelper.readObject(ExternalizableHelper.java:2253)
    at com.tangosol.io.DefaultSerializer.deserialize(DefaultSerializer.java:74)
    at com.tangosol.util.ExternalizableHelper.deserializeInternal(ExternalizableHelper.java:2703)
    at com.tangosol.util.ExternalizableHelper.fromBinary(ExternalizableHelper.java:261)
    ... 16 more

    2011-02-11 13:08:47.940/169.244 Oracle Coherence GE 3.6.1.1 Error (thread=Proxy:ExtendTcpProxyService:TcpAcceptor, member=1): Failed to publish EntryOperation{siteName=csfb.cs-group.com, clusterName=SPTestCluster, cacheName=source-cache, operation=Insert, publishableEntry=PublishableEntry{key=Binary(length=32, value=0x15A90F00004E07424F4F4B303038014E08494E535430393834024E0345535040), value=Binary(length=147, value=0x1281A30115AA0F0000A90F00004E07424F4F4B303038014E08494E535430393834024E03455350400248ADEEF99607060348858197BF22060448B4D8E9BE02060548A0D2CDC70E060648B0E9A2C4030607488DBCD6E50D060848B18FC1882006094E03303038402B155B014E0524737263244E1F637366622E63732D67726F75702E636F6D2D535054657374436C7573746572), originalValue=Binary(length=0, value=0x)}} to Cache passive-cache because of
    (Wrapped) java.io.StreamCorruptedException: invalid type: 169 Class:com.oracle.coherence.patterns.pushreplication.publishers.cache.AbstractCachePublisher
    2011-02-11 13:08:47.940/169.244 Oracle Coherence GE 3.6.1.1 D5 (thread=Proxy:ExtendTcpProxyService:TcpAcceptor, member=1): An exception occurred while processing a InvocationRequest for Service=Proxy:ExtendTcpProxyService:TcpAcceptor: (Wrapped: Failed to publish a batch with the publisher [Active Publisher] on cache [source-cache]) java.lang.IllegalStateException: Attempted to publish to cache passive-cache
    at com.tangosol.util.Base.ensureRuntimeException(Base.java:293)
    at com.oracle.coherence.patterns.pushreplication.publishers.RemoteClusterPublisher$RemotePublishingAgent.run(RemoteClusterPublisher.java:348)
    at com.tangosol.coherence.component.net.extend.proxy.serviceProxy.InvocationServiceProxy.query(InvocationServiceProxy.CDB:6)
    at com.tangosol.coherence.component.net.extend.messageFactory.InvocationServiceFactory$InvocationRequest.onRun(InvocationServiceFactory.CDB:12)
    at com.tangosol.coherence.component.net.extend.message.Request.run(Request.CDB:4)
    at com.tangosol.coherence.component.net.extend.proxy.serviceProxy.InvocationServiceProxy.onMessage(InvocationServiceProxy.CDB:9)
    at com.tangosol.coherence.component.net.extend.Channel.execute(Channel.CDB:39)
    at com.tangosol.coherence.component.net.extend.Channel.receive(Channel.CDB:26)
    at com.tangosol.coherence.component.util.daemon.queueProcessor.service.Peer.onNotify(Peer.CDB:103)
    at com.tangosol.coherence.component.util.Daemon.run(Daemon.CDB:42)
    at java.lang.Thread.run(Thread.java:662)
    Caused by: java.lang.IllegalStateException: Attempted to publish to cache passive-cache
    at com.oracle.coherence.patterns.pushreplication.publishers.cache.AbstractCachePublisher.publishBatch(AbstractCachePublisher.java:163)
    at com.oracle.coherence.patterns.pushreplication.publishers.RemoteClusterPublisher$RemotePublishingAgent.run(RemoteClusterPublisher.java:343)
    ... 9 more
    Caused by: (Wrapped) java.io.StreamCorruptedException: invalid type: 169
    at com.tangosol.util.ExternalizableHelper.fromBinary(ExternalizableHelper.java:265)
    at com.tangosol.coherence.component.util.daemon.queueProcessor.service.grid.PartitionedService$ConverterKeyToBinary.convert(PartitionedService.CDB:16)
    at com.tangosol.util.ConverterCollections$ConverterInvocableMap.invoke(ConverterCollections.java:2156)
    at com.tangosol.util.ConverterCollections$ConverterNamedCache.invoke(ConverterCollections.java:2622)
    at com.tangosol.coherence.component.util.daemon.queueProcessor.service.grid.partitionedService.PartitionedCache$ViewMap.invoke(PartitionedCache.CDB:11)
    at com.tangosol.coherence.component.util.SafeNamedCache.invoke(SafeNamedCache.CDB:1)
    at com.oracle.coherence.patterns.pushreplication.publishers.cache.AbstractCachePublisher.publishBatch(AbstractCachePublisher.java:142)
    ... 10 more
    Caused by: java.io.StreamCorruptedException: invalid type: 169
    at com.tangosol.util.ExternalizableHelper.readObjectInternal(ExternalizableHelper.java:2265)
    at com.tangosol.util.ExternalizableHelper.readObject(ExternalizableHelper.java:2253)
    at com.tangosol.io.DefaultSerializer.deserialize(DefaultSerializer.java:74)
    at com.tangosol.util.ExternalizableHelper.deserializeInternal(ExternalizableHelper.java:2703)
    at com.tangosol.util.ExternalizableHelper.fromBinary(ExternalizableHelper.java:261)
    ... 16 more

    It seems to be loading my POF configuration file - which also includes the standard Coherence ones as well as those required for PR - just fine, as you can see at the top of the trace.

    Any ideas why using POF format for my objects gives this error? (NB: I've tested the POF stuff outside of PR and it all works fine.)

    EDIT: I've tried switching the "publisher" to the "file" publisher in PR. And that works fine. I see my POF format cached data extracted and published to the directory I specify. So the "publish" part seems to work when I use a file-publisher.

    Cheers,

    Steve

    DB:2.72:Getting "Invalid Type: 169" Errors When Using Pof With Push Replication mz

    Cheers, Neville, you've given me a lot to look at.

    I find it VERY confusing trying to sift my way through the Push Replication stuff! Documentation seems to be sorely lacking, IMHO.

    I had a ConfigurablePofContext before I started using PR. That, of course, worked fine. I've had it both "in" and "out" of my config, but it doesn't help. Also, if you look through the configs in the Incubator examples, they don't have it either, so I commented it out again. For both my sending and receiving cluster I've tried to follow the examples and their cache config files as closely as possible.

    Anyway...

    Based on your earlier comments about "what command-line options was I using" I decided to play around with some! :)

    I noticed in the Coherence 3.5 book it talks about POF not being switched-on by default. So, I tried throwing:

    -Dtangosol.pof.enabled=true

    at my 2 servers and the test program that puts rows in the first cluster. And it works now. Result! Mind you, I couldn't find this documented anywhere for the Incubator PR examples (even a search of the files doesn't show it).

    Oh well, need to look into this more next week. At least I got it working before the weekend.

    If you don't mind, I'll update this thread on Monday, just so I can get your thoughts.

    Cheers,

    Steve
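
    (For anyone hitting the same thing: the effect of that JVM flag can also be sketched programmatically, as below. This is a hedged example, not taken from the thread; the cache name is made up, and the property has to be set before any cluster services start.)

    import com.tangosol.net.CacheFactory;
    import com.tangosol.net.NamedCache;

    public class PofEnabledSketch {
        public static void main(String[] args) {
            // Equivalent in spirit to passing -Dtangosol.pof.enabled=true on the command line;
            // it must be set before the cache factory / cluster services are started.
            System.setProperty("tangosol.pof.enabled", "true");

            NamedCache cache = CacheFactory.getCache("source-cache"); // cache name assumed
            cache.put("key", "value");
            CacheFactory.shutdown();
        }
    }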

  • RELEVANCY SCORE 2.72

    DB:2.72:Use Pof Serialization In .Net 1k


    Hello,

    I have a class called MyClass and a POF configuration for this type (my-pof-config.xml).
    I need to serialize an instance of this class and then send it via JMS.

    In Coherence Java API, there is ExternalizableHelper.toByteArray/fromByteArray. How can I do pof serialization and deserialization in C#?

    Thank you.

    DB:2.72:Use Pof Serialization In .Net 1k

    Thanks to Alexey Ragozin. Solution is below:

    // serialize
    val a = new MyObject
    val cpc = new ConfigurablePofContext("my-pof-config.xml")
    val buf = new ByteArrayWriteBuffer(4096)
    val out = buf.getBufferOutput()
    cpc.serialize(out, a)
    val bin = buf.toBinary()

    // deserialize
    val inb = new ByteArrayReadBuffer(bin.toByteArray)
    val in = inb.getBufferInput
    println(cpc.deserialize(in).asInstanceOf[MyObject])

    Cheers,
    Alexander Nemish

    Edited by: 944906 on 06.07.2012 9:37
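
    For comparison, the Java-side round trip that the question mentions (ExternalizableHelper plus a ConfigurablePofContext) can be done in a few lines. This is only a sketch under the same assumptions as above: the POF config file name and the MyObject class are placeholders.

    import com.tangosol.io.pof.ConfigurablePofContext;
    import com.tangosol.util.Binary;
    import com.tangosol.util.ExternalizableHelper;

    public class PofRoundTripSketch {
        public static void main(String[] args) {
            ConfigurablePofContext ctx = new ConfigurablePofContext("my-pof-config.xml");

            MyObject original = new MyObject();                          // hypothetical POF user type
            Binary bin = ExternalizableHelper.toBinary(original, ctx);   // serialize
            byte[] bytes = bin.toByteArray();                            // e.g. payload for a JMS BytesMessage

            MyObject copy = (MyObject) ExternalizableHelper.fromBinary(
                    new Binary(bytes), ctx);                             // deserialize
            System.out.println(copy);
        }
    }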

  • RELEVANCY SCORE 2.72

    DB:2.72:Error While Running Some Reports x1



    Hi All,

    We are facing issues while running some reports in the BI production portal; below are some of the error messages.

    --Access Error: Authorization check for caller assignment to J2EE security role [service.naming : jndi_all_operations] referencing J2EE security role [SAP-J2EE-engine : administrators].

    --Serialization failed for object (PCD location)

    --The metadata of CMD "open_Dialog_DLg_Variable" are incorrect for parameter "Target_Dialog_Ref"

    Kindly help me to resolve the above issue .

    Regards,

    Nithya

    DB:2.72:Error While Running Some Reports x1


    Can you check SU53? It sounds like an authorization is missing for some roles. You can talk to the Basis team about the other errors.

  • RELEVANCY SCORE 2.72

    DB:2.72:Packaging Pof Object Class In A Jar File 7c


    I have the following issue packaging a POF class.

    I have two POF classes which are defined in the package oracle.communications.activation.asap.ace;
    1. Token.java
    2. Asdl.java

    For simplicity I package them in the coherence.jar along with the other necessary artifacts.

    "tokens-pof-config.xml"

    ?xml version="1.0"?
    pof-config xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xmlns="http://xmlns.oracle.com/coherence/coherence-pof-config"
    xsi:schemaLocation="http://xmlns.oracle.com/coherence/coherence-pof-config coherence-pof-config.xsd"
    user-type-list
    !-- coherence POF user types --
    includecoherence-pof-config.xml/include
    !-- com.tangosol.examples package --
    user-type
    type-id1001/type-id
    class-nameToken/class-name
    /user-type
    user-type
    type-id1002/type-id
    class-nameAsdl/class-name
    /user-type
    /user-type-list
    allow-interfacestrue/allow-interfaces
    allow-subclassestrue/allow-subclasses
    /pof-config

    Then I start up the CacheServer and it starts up fine. However, when I invoke the POF classes from my WebLogic artifacts I get the following error message.

    Oct 26, 2011 10:04:24 PM PDT Warning EJB BEA-010065 MessageDrivenBean threw an Exception in onMessage(). The exception was:
    (Wrapped) java.io.IOException: unknown user type: oracle.communications.activation.asap.ace.Token.
    (Wrapped) java.io.IOException: unknown user type: oracle.communications.activation.asap.ace.Token
    .

    ======================================================================================================================

    If, however, I define the classes in the default package (no package) instead of the package oracle.communications.activation.asap.ace, keeping everything else the same, everything works fine.
    ====================================================================================
    What do I need to do to make this work if my POF classes are not in the default package?

    DB:2.72:Packaging Pof Object Class In A Jar File 7c

    If you got this error message:

    (wrong name: Token)

    most of the time it indicates that the Token class found on the classpath does not have a package defined.

    I know you have changed your Token.java to include the package name. However, for whatever reason, the coherence.jar still contains a Token class without a package name.

    You might want to grab someone who knows Java packages and classpaths and check your "coherence.jar".

  • RELEVANCY SCORE 2.71

    DB:2.71:C++ Illegalstateexception Using Pofwriter::Setversionid 19


    Using C++ and Coherence 3.7:

    I'm a Coherence newbie and have a simple class which has two member integers and one std::string. I use the Managed<T> style of serialization, which works OK.

    template void serializeFoo(PofWriter::Handle hOut, const Foo foo)
    {
    // Write to the cache.
    hOut->setVersionId(0);
    hOut->writeInt32(1, foo.GetInt());
    hOut->writeInt32(2, foo.GetAnotherInt());
    hOut->writeString(3, foo.GetString());
    }

    This serialization (and the corresponding deserialization) is invoked and works in our main application; data is written to and read from our cache.
    -----------------------
    In a separate test harness (C++ command line app) I want to invoke the serialization and study the binary serialized data.

    // === FailingCode ==================================================

    // An array of bytes to hold the output of serializing the Foo...
    Array<octet_t>::Handle hArray = Array<octet_t>::create();

    // ...and a buffer wrappering that array...
    coherence::io::OctetArrayWriteBuffer::Handle hOctArrWriteBuffer = coherence::io::OctetArrayWriteBuffer::create(hArray);

    // ...and an output buffer wrappering that buffer...
    WriteBuffer::BufferOutput::Handle hOctArrBufferOutput = coherence::io::OctetArrayWriteBuffer::OctetArrayBufferOutput::create(hOctArrWriteBuffer);

    // ...and a write buffer wrappering that buffer.
    coherence::io::pof::PofBufferWriter::Handle hPofBufferWriter = coherence::io::pof::PofBufferWriter::create(hOctArrBufferOutput, coherence::io::pof::SystemPofContext::getInstance());

    // Directly call the serialization method for Foo.
    Foo fooToWrite;
    serializeFoo(hPofBufferWriter, fooToWrite); // Calling this throws an exception - see below.

    // === /FailingCode ==================================================

    When invoked by the code above, the setVersionId call throws (see below). This is using exactly the same serialization code that works in the main application.

    hOut->setVersionId(0); // Throws coherence::lang::IllegalStateException - "not in a user type"
    --------------------------------------
    Questions: -

    1) What does this exception mean in this context?
    2) How do I resolve this? i.e. how do I explicitly invoke serialization in a test harness without exceptions being raised and without writing to a cache?

    Thanks.

    Edited by: DonLonDon on Feb 14, 2012 3:51 PM

    DB:2.71:C++ Illegalstateexception Using Pofwriter::Setversionid 19

    Thanks Robert.

    In case anyone else is trying to do this the code now looks something like this...
    ----------------------------------------------------------------------------------
    Managed<Foo>::View vOriginalFoo = Managed<Foo>::create();

    // Get the system PofContext. On compilation the Foo class should have automatically been
    // registered with the system POF context. Test that the class exists and is registered.
    // These calls will throw if it isn't.
    const int pofTypeIdFoo(constants::POF_ID_FOO);

    // Get the system POF context. This should know about Foo.
    const coherence::io::pof::SystemPofContext::View vSystemPofContext = coherence::io::pof::SystemPofContext::getInstance();

    // Get the Foo class (i.e. type info and serializer).
    const coherence::Class::View vClass = vSystemPofContext-getClass(pofTypeIdFoo);
    const PofSerializer::View vSerializer = vSystemPofContext-getPofSerializer(pofTypeIdFoo);

    // A SimplePofContext to perform the serialization.
    const coherence::io::pof::SimplePofContext::Handle hPofContext = coherence::io::pof::SimplePofContext::create();

    // Register the Foo type and serializer with the PofContext.
    hPofContext-registerUserType(pofTypeIdFoo, vClass, vSerializer);

    // Serialize the data using a serialization helper.
    const coherence::util::Binary::View vBinary = coherence::util::SerializationHelper::toBinary(vOriginalFoo, hPofContext);

    // Get the number of bytes serialized.
    const size32_t serializedBinaryLength(vBinary-length());

    // Test that we serialized something...
    CPPUNIT_ASSERT(serializedBinaryLength > 0);

    // Bigger tests go here...

  • RELEVANCY SCORE 2.70

    DB:2.70:Question About Putting Multiple Pof Types In One Cache dj


    A single POF type works for me in one cache. I was trying to add one more POF type to the cache, however I got an error message saying the second type was not found (java.lang.ClassNotFoundException). The procedure I tried is as follows.

    Initially, I had Class1 in Package1, which worked fine. I added Class2 to Package1, and then compiled and created the JAR file successfully. The custom-types-pof-config.xml looks like:

    ?xml version="1.0"?

    !DOCTYPE pof-config SYSTEM "pof-config.dtd"

    pof-config
    user-type-list
    !-- include all "standard" Coherence POF user types --
    includecoherence-pof-config.xml/include

    !-- include all application POF user types --
    user-type
    type-id1002/type-id
    class-nameClass1.Package1/class-name
    /user-type
    user-type
    type-id1003/type-id
    class-nameClass2.Package1/class-name
    /user-type

    /user-type-list
    /pof-config

    Anybody has an idea why this fails? Thanks

    DB:2.70:Question About Putting Multiple Pof Types In One Cache dj

    Hi,

    Is it possible that you forgot to recycle all of the processes?

    Regards,

    Harv

  • RELEVANCY SCORE 2.69

    DB:2.69:Pofextractor Encoder Error When Moving From Distributed To Replicated 7d


    Hi All,

    I have a piece of code which initially used the Distributed Caching scheme with the following configuration:

    coherence-cache-config.xml
    =================

    <cache-mapping>
      <cache-name>*</cache-name>
      <scheme-name>example-distributed</scheme-name>
    </cache-mapping>

    <distributed-scheme>
      <scheme-name>example-distributed</scheme-name>
      <service-name>DistributedCache</service-name>
      <serializer>
        <class-name>com.tangosol.io.pof.ConfigurablePofContext</class-name>
        <init-params>
          <init-param>
            <param-type>string</param-type>
            <param-value>tokens-pof-config.xml</param-value>
          </init-param>
        </init-params>
      </serializer>
      <backing-map-scheme>
        <local-scheme>
          <scheme-ref>example-binary-backing-map</scheme-ref>
          <expiry-delay>100000s</expiry-delay>
        </local-scheme>
      </backing-map-scheme>
      <autostart>true</autostart>
      <thread-count>2500</thread-count>

    "tokens-pof-config.xml"
    ===============

    ?xml version="1.0"?
    pof-config xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xmlns="http://xmlns.oracle.com/coherence/coherence-pof-config"
    xsi:schemaLocation="http://xmlns.oracle.com/coherence/coherence-pof-config coherence-pof-config.xsd"
    user-type-list
    !-- coherence POF user types --
    includecoherence-pof-config.xml/include
    !-- com.tangosol.examples package --
    user-type
    type-id1001/type-id
    class-nameoracle.communications.activation.asap.ace.Token/class-name
    /user-type
    user-type
    type-id1002/type-id
    class-nameoracle.communications.activation.asap.ace.Asdl/class-name
    /user-type
    /user-type-list
    allow-interfacestrue/allow-interfaces
    allow-subclassestrue/allow-subclasses
    /pof-config

    And everything works fine.

    Then I changed the caching scheme to a replicated cache instead of a distributed caching scheme by changing the configuration to the following:

    <cache-mapping>
      <cache-name>*</cache-name>
      <scheme-name>example-replicated</scheme-name>
    </cache-mapping>

    <replicated-scheme>
      <scheme-name>example-replicated</scheme-name>
      <service-name>ReplicatedCache</service-name>
      <serializer>
        <class-name>com.tangosol.io.pof.ConfigurablePofContext</class-name>
        <init-params>
          <init-param>
            <param-type>string</param-type>
            <param-value>tokens-pof-config.xml</param-value>
          </init-param>
        </init-params>
      </serializer>
      <backing-map-scheme>
        <local-scheme>
          <scheme-ref>unlimited-backing-map</scheme-ref>
        </local-scheme>
      </backing-map-scheme>
      <thread-count>2500</thread-count>
      <autostart>true</autostart>
    </replicated-scheme>

    Once I do this and run my piece of code, I start seeing the following error message. Any ideas what I did wrong?

    java.lang.UnsupportedOperationException: PofExtractor must be used with POF-encoded Binary entries; the Map Entry is not a BinaryEntry.

    Please help me :)

    DB:2.69:Pofextractor Encoder Error When Moving From Distributed To Replicated 7d

    807103 wrote:
    Thanks a ton Robert.

    A follow up question, will it be fair to assume hence that a partitioned or distributed cache would be more performant for POF objects as compared to a replicated cache due to full serialization/de-serialization ?

    Thanks,

    Ankit

    Not really.

    Provided that you have the same set of objects in both the replicated and the distributed cache, then there are a number of factors which are better for replicated cache and some which are better for distributed cache if only query performance is considered:

    Replicated cache has all the data locally, hence there is no cost for retrieving data across the network.

    If a query can be executed completely based on index data, then it is usually faster to query it locally, even if set operations for logical expressions in the filter may be more costly, because the number of index operations is the same; you just need to work with larger sets. Again, this is due to the data being available locally.

    If you need to iterate through the entry set to evaluate the filter one by one, then a distributed cache can do it in parallel on many nodes. On the other hand, since the data is available in Java object form, you usually do not have to pay the deserialization cost, and hence evaluating a ValueExtractor in a replicated cache is faster than evaluating POF-based extractors, which have to read through the POF binary until they arrive at the value to extract (it helps to move these toward the front of the stream by giving them a low property id). Even though you do not generate that much garbage and can break off once you have reached the field you want, POF extraction does take some time.

    So in general, if you can fit the data into replicated cache with the same indexes, then replicated cache will likely be faster for query performance.

    The advantages of the distributed cache above replicated cache starts to become interesting when you want to scale out either your data or your nodes or both:

    Data-set size cannot scale with a replicated cache for obvious reasons (it is limited by the smallest cluster node). In a distributed cache you can scale out with the number of nodes (you can have hundreds of nodes with the default configuration in Coherence and with tuning you can grow a bit more).

    Cost of updating the data is linear with the number of nodes in a replicated cache and constant in a partitioned (distributed) cache.

    It is an interesting fact, though, that most filter-based operations do not scale with increasing the number of nodes, as each node has to execute the operation (unless it is partition-filtered).

    Best regards,

    Robert
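    For illustration only (the cache name, getter and value below are hypothetical, not taken from the configuration above): on a replicated cache the entries are held in deserialized object form, so a reflection-based extractor and index can be used for filtering, whereas PofExtractor needs POF-encoded binary entries and therefore a partitioned cache.

    NamedCache cache = CacheFactory.getCache("tokens");
    ValueExtractor extractor = new ReflectionExtractor("getNeID");   // hypothetical getter
    cache.addIndex(extractor, false, null);                          // optional index, unordered, no comparator
    Set results = cache.entrySet(new EqualsFilter(extractor, "Toronto"));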

  • RELEVANCY SCORE 2.69

    DB:2.69:Cqc Not Working For 3.5.1? jj


    We use CQC with AlwaysFilter and a map listener, which works for 3.3.1, 3.4, 3.4.1, 3.4.2 and 3.5.

    Right after we replace the 3.5 coherence jar with the 3.5.1 version, the map listener no longer receives any map events. Put back the 3.5 coherence jar and it works again.

    The cache is using POF serialization, and the objects inserted into this cache are of type byte[].

    Has anyone observed a similar problem?

    DB:2.69:Cqc Not Working For 3.5.1? jj

    Thank you rhlee,

    The fix does work.

    Thanks again and regards
    /Anand

  • RELEVANCY SCORE 2.67

    DB:2.67:Java.Lang.Illegalstateexception: Missing Pof Configuration z9


    Hi All,

    I am using Coherence 3.4.2 and trying to execute a sample with POF, and I am getting the below exception:

    .(Wrapped: error configuring class "com.tangosol.io.pof.ConfigurablePofContext") java.lang.IllegalStateException: Missing POF configuration (Config=person-pof-config.xml)

    my cache config is

    ?xml version="1.0"?

    !DOCTYPE cache-config SYSTEM "cache-config.dtd"

    cache-config
    caching-scheme-mapping
    cache-mapping
    cache-name*/cache-name
    scheme-nameExamplesPartitionedPofScheme/scheme-name
    /cache-mapping
    /caching-scheme-mapping

    caching-schemes
    distributed-scheme
    scheme-nameExamplesPartitionedPofScheme/scheme-name
    service-namePartitionedPofCache/service-name
    serializer
    class-namecom.tangosol.io.pof.ConfigurablePofContext/class-name
    init-params
    init-param
    param-typeString/param-type
    param-valueperson-pof-config.xml/param-value
    /init-param
    /init-params
    /serializer
    backing-map-scheme
    local-scheme
    !-- each node will be limited to 250MB --
    high-units250M/high-units
    unit-calculatorbinary/unit-calculator
    /local-scheme
    /backing-map-scheme
    autostarttrue/autostart
    /distributed-scheme
    /caching-schemes
    /cache-config

    How do I specify the POF config file using a command line option?

    Please clarify ASAP.

    Regards
    SRINI

    DB:2.67:Java.Lang.Illegalstateexception: Missing Pof Configuration z9

    For more information:
    http://coherence.oracle.com/display/COH34UG/POF+User+Type+Configuration+Elements
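    In case it helps, the usual way in Coherence 3.x to point every node at the POF configuration from the command line is via system properties (the classpath entries below are illustrative). ConfigurablePofContext loads person-pof-config.xml as a classpath resource, so the file must also be on the classpath of each node:

    java -Dtangosol.pof.enabled=true \
         -Dtangosol.pof.config=person-pof-config.xml \
         -Dtangosol.coherence.cacheconfig=coherence-cache-config.xml \
         -cp coherence.jar:config-dir:app-classes.jar \
         com.tangosol.net.DefaultCacheServer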

  • RELEVANCY SCORE 2.66

    DB:2.66:Messaging Pattern 2.7.1 Removesubscriberfrommessageprocessor Class Missing s9


    When testing incubator Messaging 2.7.1 under Coherence *3.5.3* the server startup fails due to a missing class.

    In coherence-3.5-messagingpattern-2.7.1.18316.jar the POF configuration file coherence-messagingpattern-pof-config.xml contains an entry for the following type:

    user-type
    type-id12037/type-id
    class-namecom.oracle.coherence.patterns.messaging.entryprocessors.RemoveSubscriberFromMessageProcessor/class-name
    /user-type

    This class is absent from the com.oracle.coherence.patterns.messaging.entryprocessors package, though it does exist in version 2.7.0.

    This causes the extensible namespace loader to produce the following exception when loading the messaging POF config at startup:

    Exception in thread "main" (Wrapped: Failed to start Service "DistributedCacheForSequenceGenerators" (ServiceState=SERVICE_STOPPED))...
    (Wrapped: Unable to load class for user type (Config=pof-config.xml, Type-Id=12037, Class-
    Name=com.oracle.coherence.patterns.messaging.entryprocessors.RemoveSubscriberFromMessageProcessor))
    (Wrapped) java.lang.ClassNotFoundException: com.oracle.coherence.patterns.messaging.entryprocessors.RemoveSubscriberFromMessageProcessor

    - Is this class still required but missing from 2.7.1?
    - Or was this class removed intentionally from the JAR, and is the POF file now incorrect?

    I assume push replication 3.0.1 can't be used since it depends on messaging 2.7.1. Perhaps I'm making a configuration error; however, there was no problem running the previous version.

    Kind regards
    phil

    DB:2.66:Messaging Pattern 2.7.1 Removesubscriberfrommessageprocessor Class Missing s9

    Thank you for the advice Neville. The startup works correctly after removing the POF entry for the missing class.

    It is surprising that Oracle's tests did not discover this. I imagine the Messaging and PRP example projects would fail immediately.

    Can such a step be added to the release process in future? (The testing should include 3.5 and 3.6 versions)

    Can I also suggest that the wiki be updated with a note to tell developers they need to do this? Otherwise all users may have to go through this same debugging process.

    Kind regards,
    phil

    Edited by: phil wheeler on Sep 27, 2010 1:34 PM

  • RELEVANCY SCORE 2.66

    DB:2.66:Handling Enums In The Equals Filter Using Pof k9


    Can someone point to how to handle enums in a grid that is set up to use POF serialization (3.4) and an EqualsFilter?

    When objects are serialized, we are serializing enums as integers. However, when trying to use an EqualsFilter to search for objects by an enum value, we are not getting a result back. The only way we have found so far is to create a getter that returns the integer value of the enum. Is there another way?

    DB:2.66:Handling Enums In The Equals Filter Using Pof k9

    Timur wrote:
    Can someone point to how to handle enums in a grid that is set up to use POF serialization (3.4) and an EqualsFilter?

    When objects are serialized, we are serializing enums as integers. However, when trying to use an EqualsFilter to search for objects by an enum value, we are not getting a result back. The only way we have found so far is to create a getter that returns the integer value of the enum. Is there another way?

    Hi Timur,

    the EqualsFilter invokes either a getter method or an extractor.

    If you have a getter method which returns the ordinal as an int value, then you can use the EqualsFilter with the ordinal of the enum to search for passed in to its parameter.

    However, if you have a getter returning the enum value, I believe it fails by default, because the EqualsFilter containing the enum value to search for cannot be serialized. However, you can still make that work, but in this case you have to create a custom serializer for the enum type (or for the java.lang.Enum class, if allow-subclasses is enabled in the POF configuration). This will allow Coherence to serialize and deserialize the enum value contained in the EqualsFilter over POF, which otherwise may not be supported by the default configuration.

    Once the enum type can be serialized, the EqualsFilter with the getter method returning an enum value will work fine, too.

    Best regards,

    Robert
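    To make the "getter returning an int" workaround concrete, here is a minimal sketch (the class, enum, property index and cache are hypothetical): the enum is stored as its ordinal, a getter exposes that ordinal, and the EqualsFilter compares plain integers, so no enum value ever needs to be serialized inside the filter.

    public enum Status { NEW, ACTIVE, CLOSED }

    public class Order implements PortableObject {
        private Status status;

        public Order() {
        }

        public Status getStatus() {
            return status;
        }

        // Getter used for querying: returns an int, which POF serializes out of the box
        public int getStatusOrdinal() {
            return status.ordinal();
        }

        public void readExternal(PofReader in) throws IOException {
            status = Status.values()[in.readInt(0)];
        }

        public void writeExternal(PofWriter out) throws IOException {
            out.writeInt(0, status.ordinal());
        }
    }

    // Client side: query by the ordinal instead of the enum constant itself
    Filter filter = new EqualsFilter("getStatusOrdinal", Status.ACTIVE.ordinal());
    Set results = cache.entrySet(filter);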

  • RELEVANCY SCORE 2.66

    DB:2.66:C++ Pofwriter::Writeremainder - Mandatory? jf


    The C++ docs (Release 3.7.1.0) for PofWriter::writeRemainder state: -

    "As part of writing out a user type, this method *must* be called by the PofSerializer that is writing out the user type, or the POF stream will be corrupted."

    In the example serialization approaches In Oracle Coherence Client Guide Release 3.7...

    Example 10-2 Managed Class using Serialization does NOT call this method.

    Example 10-7 An External Class Responsible for Serialization DOES call this method.

    Can someone please explain why?

    Adding the writeRemainder call to Managed<T> serialization causes an EOFException. Leaving out the call (as in the examples) makes serialization succeed. I'm slightly confused.

    Edited by: DonLonDon on Feb 14, 2012 4:38 PM

    DB:2.66:C++ Pofwriter::Writeremainder - Mandatory? jf

    DonLonDon wrote:
    The C++ docs (Release 3.7.1.0) for PofWriter::writeRemainder state: -

    "As part of writing out a user type, this method *must* be called by the PofSerializer that is writing out the user type, or the POF stream will be corrupted."

    In the example serialization approaches In Oracle Coherence Client Guide Release 3.7...

    Example 10-2 Managed Class using Serialization does NOT call this method.

    Example 10-7 An External Class Responsible for Serialization DOES call this method.

    Can someone please explain why?

    Adding the writeRemainder call to Managed<T> serialization causes an EOFException. Leaving out the call (as in the examples) makes serialization succeed. I'm slightly confused.

    Edited by: DonLonDon on Feb 14, 2012 4:38 PM

    Hi DonLonDon,

    The external serializer class does indeed need to call the writeRemainder/readRemainder.

    The managed class does not need to call writeRemainder/readRemainder as the caller of the readExternal/writeExternal methods would call it.

    This allows you to override the readExternal/writeExternal methods and delegate superclass attribute (de)serialization to the superclass readExternal/writeExternal implementation without it calling readRemainder/writeRemainder too early before the subclass had a chance to (de)serialize subsequent attributes.

    Best regards,

    Robert
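    A minimal Java sketch of the same rule (Person is a hypothetical user type): an external PofSerializer must pair readRemainder/writeRemainder itself, whereas a PortableObject must not, because the caller of readExternal/writeExternal takes care of it.

    public class PersonSerializer implements PofSerializer {
        public void serialize(PofWriter out, Object o) throws IOException {
            Person person = (Person) o;
            out.writeString(0, person.getName());
            out.writeRemainder(null);               // mandatory in an external serializer
        }

        public Object deserialize(PofReader in) throws IOException {
            Person person = new Person(in.readString(0));
            in.readRemainder();                     // mandatory in an external serializer
            return person;
        }
    }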

  • RELEVANCY SCORE 2.66

    DB:2.66:Direct Backingmap Update Using Pof? ca


    Using the backing map that I can get hold of from a BinaryEntry, I can access the internal keys and values and, using converters, turn them into deserialized Java objects.

    Is it possible to somehow use POF extractors / updaters to update the values in the backing map instead of replacing them, and in this way reduce the amount of serialization / deserialization? If so, how could I go about it?

    /Magnus

    DB:2.66:Direct Backingmap Update Using Pof? ca

    After a bit of digging in our code I can see where I was getting confused about triggers.

    I suspect triggers do fire when you use an EntryProcessor and update the backing map.

    Where triggers do not fire is in our case, where we have a normal invocation service invocable that we use to restore data from disk backups, and this writes directly to the various backing maps. In this case triggers do not fire but cache stores do.

    JK

  • RELEVANCY SCORE 2.65

    DB:2.65:Re: Pof Question 8m


    Yes, that possibility occurred to me, but I assumed (perhaps incorrectly) that this class was tied to ExternalizableLite serialization...

    Any experiments with this that you can share with the community are warmly appreciated :-)

    /Magnus

    DB:2.65:Re: Pof Question 8m

    robvarga wrote:

    Is there no means to "dry-run" POF serialization/deserialization (like you can do with Java's normal serialization by setting up an ObjectStream to work against a byte array and seeing how much space is consumed) without having to set up a cache etc.? That would be very handy both for module tests (that check that you get back the same object after serialization / deserialization) and for measuring space usage!

    Hi Magnus,

    certainly it is possible, I will try to put together an example this evening, but I can't give you one off the top of my head at the moment.

    Robert,

    If you wrote something for this, can you please share it with me? I am trying to serialize and deserialize objects using POF in my JUnit tests without writing them out to a cache.

    Regards,
    Sairam

    Edited by: SairamR on Dec 11, 2008 4:29 PM

    Edited by: SairamR on Dec 11, 2008 4:30 PM
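    For anyone else looking for this, here is a minimal dry-run sketch along the lines Robert describes (the config file name and MyType are placeholders): ConfigurablePofContext can be instantiated directly and used with ExternalizableHelper to round-trip an object without any cache or cluster.

    ConfigurablePofContext ctx = new ConfigurablePofContext("my-pof-config.xml");

    MyType original = new MyType("some", "state");
    Binary binary   = ExternalizableHelper.toBinary(original, ctx);
    MyType copy     = (MyType) ExternalizableHelper.fromBinary(binary, ctx);

    System.out.println("POF size in bytes: " + binary.length());
    assert original.equals(copy);   // requires a sensible equals() on MyType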

  • RELEVANCY SCORE 2.64

    DB:2.64:Biztalk 2010:Type Http://Schemas.Microsoft.Com/2003/10/Serialization/Arrays:Arrayofstring Is Not Declared. jc


    Hi Experts,
    I am getting an error message. After adding the WCF service using the Add Generated Items wizard, when I try to open the XSD I get the following error messages:
    1. Type 'http://schemas.microsoft.com/2003/10/Serialization/Arrays:ArrayOfKeyValueOfstringArrayOfTrackingResultmFAkxlpY' is not declared.
    2. Type 'http://schemas.microsoft.com/2003/10/Serialization/Arrays:ArrayOfstring' is not declared.
    So I have one element with the following type:
    xmlns:q2=http://schemas.microsoft.com/2003/10/Serialization/Arrays minOccurs=0 name=TrackingResults nillable=true type=q2:ArrayOfKeyValueOfstringArrayOfTrackingResultmFAkxlpY /
    So what can I do to overcome this? Can the experts give some tips on this?
    Thanks

    DB:2.64:Biztalk 2010:Type Http://Schemas.Microsoft.Com/2003/10/Serialization/Arrays:Arrayofstring Is Not Declared. jc

    Hi Experts,
    I have solved this issue.
    Actually, after adding the WCF service, another XSD was there which contains the type declarations. So what I did was go to my service schema and import the following schema:
    Service_1_0_schemas_microsoft_com_2003_10_Serialization_Arrays.xsd

    Finally, the issue was resolved.
    Thanks

  • RELEVANCY SCORE 2.63

    DB:2.63:Mixing Pof And Serializable In Distributed Schemes ss


    Hi,

    Is it possible to have 2 named caches, both using distributed schemes (2 distributed services), use 2 different serialization strategies, one with POF and the other with Java serialization? I defined 2 distributed schemes for the two named caches, but I get errors that the objects I put into the Java-serializable named cache are of an unknown user type. I do not want to implement PortableObject on this class.

    If I start up the grid and client with the system properties

    -Dtangosol.pof.config=knowmed-pof-config.xml
    -Dtangosol.pof.enabled=true

    both of the services get com.tangosol.io.pof.ConfigurablePofContext as their serializer class. If I don't use these system properties at startup, I am not able to start the service with the POF serializer. I get errors on both the client and the grid:

    Grid:

    2008-12-05 11:49:17.897/38.907 Oracle Coherence GE 3.4/405p1 Error (thread=DistributedCache:DistributedCachePOF, member=1): An exception (java.lang.IllegalArgumentException) occurred reading Message MemberConfigUpdate Type=-3 for Service=DistributedCache{Name=DistributedCachePOF, State=(SERVICE_STARTED), LocalStorage=enabled, PartitionCount=257, BackupCount=1, AssignedPartitions=257, BackupPartitions=0}
    2008-12-05 11:49:17.897/38.907 Oracle Coherence GE 3.4/405p1 Error (thread=DistributedCache:DistributedCachePOF, member=1): Terminating DistributedCache due to unhandled exception: java.lang.IllegalArgumentException
    2008-12-05 11:49:17.897/38.907 Oracle Coherence GE 3.4/405p1 Error (thread=DistributedCache:DistributedCachePOF, member=1):
    java.lang.IllegalArgumentException: unknown user type: 6
    at com.tangosol.io.pof.ConfigurablePofContext.getPofSerializer(ConfigurablePofContext.java:373)
    at com.tangosol.io.pof.PofBufferReader.readAsObject(PofBufferReader.java:3281)
    at com.tangosol.io.pof.PofBufferReader.readObject(PofBufferReader.java:2599)
    at com.tangosol.io.pof.ConfigurablePofContext.deserialize(ConfigurablePofContext.java:348)
    at com.tangosol.coherence.component.util.daemon.queueProcessor.Service.readObject(Service.CDB:4)
    at com.tangosol.coherence.component.util.daemon.queueProcessor.service.Grid$ServiceConfigMap.readObject(Grid.CDB:1)
    at com.tangosol.coherence.component.util.daemon.queueProcessor.service.grid.DistributedCache$ServiceConfigMap.readObject(DistributedCache.CDB:23)
    at com.tangosol.coherence.component.util.daemon.queueProcessor.service.Grid$MemberConfigUpdate.read(Grid.CDB:3)
    at com.tangosol.coherence.component.util.daemon.queueProcessor.service.Grid.onNotify(Grid.CDB:117)
    at com.tangosol.coherence.component.util.daemon.queueProcessor.service.grid.DistributedCache.onNotify(DistributedCache.CDB:3)
    at com.tangosol.coherence.component.util.Daemon.run(Daemon.CDB:37)
    at java.lang.Thread.run(Thread.java:595)

    Client:

    2008-12-05 11:49:20,741 ERROR [Log4j] 2008-12-05 11:49:20.663/25.657 Oracle Coherence GE 3.4/405p1 Error (thread=DistributedCache:DistributedCachePOF, member=2): An exception (java.io.IOException) occurred reading Message MemberConfigUpdate Type=-3 for Service=DistributedCache{Name=DistributedCachePOF, State=(SERVICE_STARTED), LocalStorage=disabled}
    2008-12-05 11:49:20,741 ERROR [Log4j] 2008-12-05 11:49:20.663/25.657 Oracle Coherence GE 3.4/405p1 Error (thread=DistributedCache:DistributedCachePOF, member=2): Terminating DistributedCache due to unhandled exception: java.io.IOException
    2008-12-05 11:49:20,757 ERROR [Log4j] 2008-12-05 11:49:20.663/25.657 Oracle Coherence GE 3.4/405p1 Error (thread=DistributedCache:DistributedCachePOF, member=2):
    java.io.IOException: unsupported type / corrupted stream: 78
    at com.tangosol.util.ExternalizableHelper.readObjectInternal(ExternalizableHelper.java:2222)
    at com.tangosol.util.ExternalizableHelper.readObject(ExternalizableHelper.java:2209)
    at com.tangosol.io.DefaultSerializer.deserialize(DefaultSerializer.java:60)
    at com.tangosol.coherence.component.util.daemon.queueProcessor.Service.readObject(Service.CDB:4)
    at com.tangosol.coherence.component.util.daemon.queueProcessor.service.Grid$ServiceConfigMap.readObject(Grid.CDB:1)
    at com.tangosol.coherence.component.util.daemon.queueProcessor.service.grid.DistributedCache$ServiceConfigMap.readObject(DistributedCache.CDB:23)
    at com.tangosol.coherence.component.util.daemon.queueProcessor.service.Grid$MemberConfigUpdate.read(Grid.CDB:3)
    at com.tangosol.coherence.component.util.daemon.queueProcessor.service.Grid.onNotify(Grid.CDB:117)
    at com.tangosol.coherence.component.util.daemon.queueProcessor.service.grid.DistributedCache.onNotify(DistributedCache.CDB:3)
    at com.tangosol.coherence.component.util.Daemon.run(Daemon.CDB:37)
    at java.lang.Thread.run(Thread.java:595)

    Please advise

    Thanks
    Sairam

    DB:2.63:Mixing Pof And Serializable In Distributed Schemes ss

    I found the problem. I was overriding the DistributedScheme with a tangosol-override.xml file in my project which was causing some conflict with the serializer setting for each distributed service. I removed the override and now I am able to have different serialization strategies for different services.

    Regards,
    Sairam

  • RELEVANCY SCORE 2.61

    DB:2.61:Pof Confusion And Need Advice xs


    Here is my situation: we are getting data from some backend library, and it returns an object which uses Java serialization. We perform certain operations on this object and save it along with a newly created object.
    Now we need to use POF. I can easily write a POF serializer for the new object, but what about that old library object that it has inside?

    These are the 2 solutions we are considering:

    old library object - oldObj
    new object - newObj

    Solution 1 - when we get oldObj, make a copy of it in some thirdObj which can be POF-enabled (because it is not included in the library) and then use that.
    The issue with this is that the conversion effort will take some time.

    Solution 2 - use a different cache, storing oldObj in "cache1" while newObj goes into a different cache, "cache2". Then cache1 will use Java serialization and cache2 will use POF.
    cache1-to-cache2 communication is necessary in this case, and I don't know how long that takes.

    I cannot modify the library since it is used by many of our other applications.
    What is the better approach? We need faster performance; that's the main goal.

    Does anyone have any advice on what we should do in this case?

    Thanks

    DB:2.61:Pof Confusion And Need Advice xs

    Hi,

    as long as you can cheaply recreate the library object from scratch (this may not be feasible depending on the library object class), then you can just write a PofSerializer for the library object class.

    Best regards,

    Robert
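    A minimal sketch of that approach (LegacyQuote stands in for the library class, and is assumed to expose enough getters and a constructor to recreate it): the external PofSerializer rebuilds the object from scratch, so the library class itself stays untouched and needs no POF awareness.

    public class LegacyQuoteSerializer implements PofSerializer {
        public void serialize(PofWriter out, Object o) throws IOException {
            LegacyQuote quote = (LegacyQuote) o;
            out.writeString(0, quote.getSymbol());
            out.writeDouble(1, quote.getPrice());
            out.writeRemainder(null);
        }

        public Object deserialize(PofReader in) throws IOException {
            String symbol = in.readString(0);
            double price  = in.readDouble(1);
            in.readRemainder();
            return new LegacyQuote(symbol, price);   // cheap recreation from scratch
        }
    }

    The serializer is then registered against the library class via the serializer element of its user-type entry in the pof-config.xml.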

  • RELEVANCY SCORE 2.61

    DB:2.61:Re: Persistent Caches x1


    Hi Luke,

    I also have a couple of comments and suggestions regarding your serialization code:

    1. You can pass null to the writeRemainder method -- there is no need to create a Binary instance.

    2. Whenever you read collection from the POF stream you should specify collection template as the second argument of the PofReader.readCollection method:

    rule.setChildren((Vector<Rule>) reader.readCollection(4, new Vector<Rule>()));

    Unlike many other serialization mechanisms, POF does not guarantee that the type of the collection read from the stream will be exactly the same as the type that was written. This has to do with the fact that POF is designed to be multi-platform, so it simply writes out collection size and elements into the stream, without any platform-specific collection type information. It is up to the serialization code to specify an empty template collection that the elements should be read into. If you don't specify it (by passing null instead), POF reader will create an instance of a default collection type, which is platform-dependent, is not guaranteed to remain the same across Coherence releases, and is definitely not a Vector<Rule> ;-)

    As you can probably guess, this would lead to cast exception during deserialization, so if this code has worked so far, I am willing to bet that the children property was set to null in all the test cases you ran.

    HTH,
    Aleks

    DB:2.61:Re: Persistent Caches x1

    I implemented it as a static in the CacheStore, yes. Because I use get() rather than put(), I would call this for every member that requires a warm cache. getAll() would be a performance improvement of course.

  • RELEVANCY SCORE 2.61

    DB:2.61:Simplepofpath Not Serializable 8c


    I am running into serialization issues when using PofExtractors and filters with Coherence 3.7.1.6. I am getting the following error when trying to do a filtered search on the keys:


    Caused by: java.io.NotSerializableException: com.tangosol.io.pof.reflect.SimplePofPath
    at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1180)
    at java.io.ObjectOutputStream.defaultWriteFields(ObjectOutputStream.java:1528)
    at java.io.ObjectOutputStream.writeSerialData(ObjectOutputStream.java:1493)
    at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1416)
    at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1174)
    at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:346)
    ....

    In my POFConfig.xml I have the following:


    user-type
    type-id1002/type-id
    class-namecom.myClass.MyPOFObject/class-name
    serializer
    class-namecom.tangosol.io.pof.PofAnnotationSerializer/class-name
    init-params
    init-param
    param-typeint/param-type
    param-value{type-id}/param-value
    /init-param
    init-param
    param-typeclass/param-type
    param-value{class}/param-value
    /init-param
    init-param
    param-typeboolean/param-type
    param-valuetrue/param-value
    /init-param
    /init-params
    /serializer
    /user-type

    MyPOFObject is using the POF annotations:


    @Portable
    public class MyPOFObject {

    @PortableProperty(FIELD1)
    private String field1;
    @PortableProperty(FIELD2)
    private String field2;
    ....
    }

    For the Extractor and Filter I have the following:

    NamedCache myCache = CacheFactory.getCache("myCache");
    ValueExtractor extractor = new PofExtractor(String.class, new SimplePofPath(MyPOFObject.FIELD1), PofExtractor.KEY);
    Filter equalsFilter = new EqualsFilter(extractor, "myData");
    Set<MyPOFObject> filteredKeys = myCache.keySet(equalsFilter);
    ...

    Any help/ideas would be appreciated.

    Thanks

    DB:2.61:Simplepofpath Not Serializable 8c

    Yep, I was missing the following in the cache config file:

    serializer
    instance
    class-namecom.tangosol.io.pof.ConfigurablePofContext/class-name
    init-params
    init-param
    param-typeString/param-type
    param-valuePOFConfig.xml/param-value
    /init-param
    /init-params
    /instance
    /serializer

    I also added the following JVM option:

    -Dtangosol.pof.enabled=true

  • RELEVANCY SCORE 2.61

    DB:2.61:Re: Unable To Load Custom Pof Configuration aj


    Hi,

    When I remove the POF config from the folder, it shows a FileNotFound error. If I keep the file, the error is that it is unable to load the POF file.

    Hope this explanation helps.

    DB:2.61:Re: Unable To Load Custom Pof Configuration aj

    It seems that all the discussion threads under the following category have been deleted:

    All Places (https://community.oracle.com/places?customTheme=otn) > Fusion Middleware (https://community.oracle.com/community/fusion_middleware) > Coherence (https://community.oracle.com/community/fusion_middleware/coherence) > Coherence Support (https://community.oracle.com/community/fusion_middleware/coherence/coherence_support) > Discussions (https://community.oracle.com/community/fusion_middleware/coherence/coherence_support/content?filterID=contentstatus%5bpublished%5dobjecttypeobjecttype%5bthread%5dcustomTheme=otn)

    I get the following error message when trying to access the above forum:

    Not Found - The item does not exist. It may have been deleted.

    Can you please tell me if there is a new discussion forum for talking about the Oracle Coherence product?

    Sumanth Sridhar

  • RELEVANCY SCORE 2.60

    DB:2.60:Pof Or Externalizablelite j8


    Hi,

    What is the best practice for serialization?

    I have currently implemented ExternalizableLite for most of my classes, but I'd like to have the advantages of XmlBean without extending XmlBean.

    I found this thread from 2006; it mentions Portable Object Format as the best of both worlds, but it also mentions that it doesn't work with distributed caches, at least in 3.2. Has this been fixed in 3.3?

    ExternalizableLite using constructor class id optimization?

    Also, when using ExternalizableLite, is it possible to serialize null values? Or do I have to guarantee that all fields always have a valid value?

    Cheers,
    Henric

    DB:2.60:Pof Or Externalizablelite j8

    Hi Henric,

    for ExternalizableLite members to be written out in another ExternalizableLite, you can just use writeObject, it will recognize ExternalizableLite implementors, and it will also optimally encode null values.

    But if the class of the written object is always the same (meaning that you can construct it without any info read from the stream earlier) then the most optimal is just to write a boolean to indicate null or not null, then simply delegate to the writeExternal of the member ExternalizableLite object. In this case the class name is not written out in either the null or the not null case.

    Best regards,

    Robert
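    A small sketch of that optimization (Parent and Child are hypothetical, both implementing com.tangosol.io.ExternalizableLite, with DataInput/DataOutput from java.io): a boolean null-marker is written first, and the member's writeExternal is then called directly, so no class name ever goes into the stream.

    public class Parent implements ExternalizableLite {
        private Child child;   // may be null

        public void readExternal(DataInput in) throws IOException {
            if (in.readBoolean()) {
                child = new Child();
                child.readExternal(in);
            }
        }

        public void writeExternal(DataOutput out) throws IOException {
            boolean fPresent = (child != null);
            out.writeBoolean(fPresent);
            if (fPresent) {
                child.writeExternal(out);
            }
        }
    }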

  • RELEVANCY SCORE 2.60

    DB:2.60:Default To Java Serialization In Case Pof Serialization Not Defined ks


    Is this possible to do, i.e. essentially, if POF serialization is not defined for a certain class, use Java serialization instead?

    Or to turn it around: is it possible to define POF serialization only for certain classes in a distributed cache and use Java serialization for the rest?

    DB:2.60:Default To Java Serialization In Case Pof Serialization Not Defined ks

    Hi,

    the problem with this is that Java serialization is not aware of POF (or, for that matter, even ExternalizableLite), so if you have a Java-serialized class which has a member which is supposed to be POF-serializable, it will in fact not be serialized with POF, because Java serialization will not delegate to POF.

    So it is very hard to mix the two together. You can do it for top-level objects by providing a special PofSerializer instance for the non-POF class which serializes it to a byte array and writes the byte array as a POF attribute, but it is not possible for POF-aware objects contained within a non-POF-aware object to be POF-serialized.

    Also, if you attempt to do this, then you can kiss goodbye to platform independence. You must use Java on both ends and have all the libraries which the classes used in the state want to pull in.

    Best regards,

    Robert
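    A rough sketch of the top-level workaround Robert mentions (the class name is illustrative): a PofSerializer that stores the whole Java-serialized form as a single byte-array POF attribute. It only works for Java clients and gives up platform independence for that type, exactly as noted above.

    public class JavaSerializationPofSerializer implements PofSerializer {
        public void serialize(PofWriter out, Object o) throws IOException {
            ByteArrayOutputStream baos = new ByteArrayOutputStream();
            ObjectOutputStream oos = new ObjectOutputStream(baos);
            oos.writeObject(o);
            oos.flush();
            out.writeByteArray(0, baos.toByteArray());   // Java-serialized form as one POF attribute
            out.writeRemainder(null);
        }

        public Object deserialize(PofReader in) throws IOException {
            byte[] ab = in.readByteArray(0);
            in.readRemainder();
            try {
                return new ObjectInputStream(new ByteArrayInputStream(ab)).readObject();
            } catch (ClassNotFoundException e) {
                throw new IOException("class of Java-serialized value not found", e);
            }
        }
    }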

  • RELEVANCY SCORE 2.60

    DB:2.60:Pof Exception , Java.Io.Ioexception: Previous Property Index=4... 9f


    I am getting the following error when the following application code runs:

    public void testPOF() {

    Token T1 = new Token(1,1,"Toronto",3,5);
    NamedCache aceCache = CacheFactory.getCache("ACE");
    aceCache.put("TokenTest1", T1);

    Token T2 = (Token) aceCache.get("AnkitAsthana");
    if (T1.getNeID().equals(T1.getNeID())) {
    System.out.println(" Works and equal ");
    }
    System.out.println("===============\n");
    }

    As you might already guess, Token is a POF object; its artifacts are attached below. The coherence-cache-server seemed to start up fine.

    Oct 13, 2011 7:56:47 PM PDT Warning EJB BEA-010065 MessageDrivenBean threw an Exception in onMessage(). The exception was:
    (Wrapped) java.io.IOException: previous property index=4, requested property index=1 while writing user type 1001.
    (Wrapped) java.io.IOException: previous property index=4, requested property index=1 while writing user type 1001
    at com.tangosol.util.ExternalizableHelper.toBinary(ExternalizableHelper.java:215)
    at com.tangosol.coherence.component.util.daemon.queueProcessor.service.grid.partitionedService.PartitionedCache$ConverterValueToBinary.convert(PartitionedCache.CDB:3)
    at com.tangosol.util.ConverterCollections$ConverterMap.put(ConverterCollections.java:1578)

    Please help, I have no idea what the issue is.
    ======================================================================================================================
    tokens-pof-config.xml
    ======================================================================================================================
    ?xml version="1.0"?
    pof-config xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xmlns="http://xmlns.oracle.com/coherence/coherence-pof-config"
    xsi:schemaLocation="http://xmlns.oracle.com/coherence/coherence-pof-config coherence-pof-config.xsd"

    user-type-list
    !-- coherence POF user types --
    includecoherence-pof-config.xml/include

    !-- com.tangosol.examples package --
    user-type
    type-id1001/type-id
    class-nameToken/class-name
    /user-type
    /user-type-list
    allow-interfacestrue/allow-interfaces
    allow-subclassestrue/allow-subclasses
    /pof-config

    ======================================================================================================================
    Token.java
    ======================================================================================================================

    import java.io.IOException;
    import java.io.Serializable;
    import com.tangosol.io.pof.PortableObject;
    import com.tangosol.io.pof.PofReader;
    import com.tangosol.io.pof.PofWriter;
    import java.sql.*;
    import java.util.Enumeration;

    public class Token implements PortableObject {

    /**
    * 1 - Unassigned
    * 2 - Available
    * 3 - Reserved
    * 4 - defunct
    */
    private int state;

    /**
    * NE-ID(s)
    */
    private String neID;

    /**
    * Number of tokens currently Active
    */
    private int tokensCurrentlyActive;

    /**
    * Max - Number of tokens available
    */
    private int maxTokensAvailable;

    /**
    * unqiue Token ID, used to identify Tokens
    */
    private int tokenID;

    /**
    *
    * POF index for data members
    */
    public static final int TOKENID = 0;
    public static final int STATE = 1;
    public static final int NEID = 2;
    public static final int CURTOKEN = 3;
    public static final int MAXTOKEN = 4;

    /**
    *
    * @param state
    */
    public void setState(int state) {
    this.state = state;
    }

    /**
    *
    * @return
    */
    public int getState() {
    return state;
    }

    /**
    *
    * @param neID
    */
    public void setNeID(String neID) {
    this.neID = neID;
    }

    /**
    *
    * @return
    */
    public String getNeID() {
    return neID;
    }

    /**
    *
    * @param tokensCurrentlyActive
    */
    public void setTokensCurrentlyActive(int tokensCurrentlyActive) {
    this.tokensCurrentlyActive = tokensCurrentlyActive;
    }

    /**
    *
    * @return
    */
    public int getTokensCurrentlyActive() {
    return tokensCurrentlyActive;
    }

    /**
    *
    * @param maxTokensAvailable
    */
    public void setMaxTokensAvailable(int maxTokensAvailable) {
    this.maxTokensAvailable = maxTokensAvailable;
    }

    /**
    *
    * @return
    */
    public int getMaxTokensAvailable() {
    return maxTokensAvailable;
    }

    /**
    *
    * @param tokenID
    */
    public void setTokenID(int tokenID) {
    this.tokenID = tokenID;
    }

    /**
    *
    * @return
    */
    public int getTokenID() {
    return tokenID;
    }

    public Token(int state, int tokenID, String neID, int maxTokensAvailable, int tokensCurrentlyActive){
    this.state = state;
    this.tokenID = tokenID;
    this.neID = "Toronto";
    this.maxTokensAvailable = maxTokensAvailable;
    this.tokensCurrentlyActive = tokensCurrentlyActive;
    }

    // ----- PortableObject interface ---------------------------------------

    /**
    * {@inheritDoc}
    */
    public void readExternal(PofReader reader)
    throws IOException
    {
    tokenID = Integer.parseInt(reader.readString(TOKENID));
    neID = reader.readString(NEID);
    tokensCurrentlyActive = Integer.parseInt(reader.readString(CURTOKEN));
    maxTokensAvailable = Integer.parseInt(reader.readString(MAXTOKEN));
    state = Integer.parseInt(reader.readString(STATE));
    }

    /**
    * {@inheritDoc}
    */
    public void writeExternal(PofWriter writer)
    throws IOException
    {
    writer.writeString(TOKENID,Integer.toString(tokenID));
    writer.writeString(NEID,neID);
    writer.writeString(CURTOKEN,Integer.toString(CURTOKEN));
    writer.writeString(MAXTOKEN,Integer.toString(MAXTOKEN));
    writer.writeString(STATE,Integer.toString(STATE));
    }
    }

    ======================================================================================================================
    coherence-cache-config.xml
    ======================================================================================================================
    distributed-scheme
    scheme-nameexample-distributed/scheme-name
    service-nameDistributedCache/service-name
    serializer
    class-namecom.tangosol.io.pof.ConfigurablePofContext/class-name
    init-params
    init-param
    param-typestring/param-type
    param-valuetokens-pof-config.xml/param-value
    /init-param
    /init-params
    /serializer

    backing-map-scheme
    local-scheme
    scheme-refexample-binary-backing-map/scheme-ref
    /local-scheme
    /backing-map-scheme

    autostarttrue/autostart
    /distributed-scheme

    DB:2.60:Pof Exception , Java.Io.Ioexception: Previous Property Index=4... 9f

    Ankit

    When using POF you need to have a default constructor, because Coherence needs to create a new instance of the class. The only thing the POF serializer knows is the name of the class to create, so it basically does something like Class.forName("class-name") to get the class and then calls newInstance() on that class to create a new instance - i.e. it calls the default constructor. There is no way the default POF serializer can know the arguments to pass to any other constructors that the class might have.

    If you do not want to have default constructors in your class, or you are working with classes from a third-party library that you want to serialize, then you would need to write external POF serializers: http://download.oracle.com/docs/cd/E24290_01/coh.371/e22837/api_pof.htm#BABJDCCC

    JK
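    For what it's worth, the "previous property index=4, requested property index=1" message itself comes from the write order: PofWriter requires property indexes to be written in ascending order, and the posted writeExternal writes STATE (index 1) after MAXTOKEN (index 4) (it also writes the index constants rather than the field values). A minimal corrected sketch, combined with the no-argument constructor described above and simplified to store the numeric fields as ints rather than strings, would look roughly like this:

    public Token() {
    }

    public void readExternal(PofReader reader) throws IOException {
        tokenID               = reader.readInt(TOKENID);    // index 0
        state                 = reader.readInt(STATE);      // index 1
        neID                  = reader.readString(NEID);    // index 2
        tokensCurrentlyActive = reader.readInt(CURTOKEN);   // index 3
        maxTokensAvailable    = reader.readInt(MAXTOKEN);   // index 4
    }

    public void writeExternal(PofWriter writer) throws IOException {
        writer.writeInt(TOKENID, tokenID);
        writer.writeInt(STATE, state);
        writer.writeString(NEID, neID);
        writer.writeInt(CURTOKEN, tokensCurrentlyActive);
        writer.writeInt(MAXTOKEN, maxTokensAvailable);
    }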

  • RELEVANCY SCORE 2.59

    DB:2.59:Xmlmessageformatter Error. Cannot Recognize The Serialization Format. sc


    Hi Guys,

    DB:2.59:Xmlmessageformatter Error. Cannot Recognize The Serialization Format. sc

    I'm not exactly sure what I did differently, but now it seems to work.

    Even Einstein asked questions...and answered them

  • RELEVANCY SCORE 2.59

    DB:2.59:Sap Webservice Error p3



    I am getting the following error while executing a custom-developed web service using BAPI_GETUSERNAME.

    I tested this web service independently using the URL and it works. But when I use it in a Web Dynpro application it gives me this error:

    Service call exception; nested exception is: com.sap.engine.services.webservices.jaxrpc.exceptions.XmlMarshalException: XML Serialization Error.

    Please help me on the same.

    Thanks in advance.

    DB:2.59:Sap Webservice Error p3


    Hi Akshta,

    The error you are getting is due to this reason:

    This error only occurs when some element of an XML document is extracted and serialized under the following conditions:

    - the element is namespace qualified,

    - there is a namespace declaration in the scope that declares the namespace prefix of this element,

    - there is another namespace declaration in the scope that declares another namespace prefix with the same namespace URI,

    - this namespace declaration precedes the namespace declaration of the element in the internal representation of the parser (since the namespace declarations are not ordered, this last condition is parser-dependent).

    Note that this problem results in non-well-formed XML containing duplicate namespace declaration attributes.

    I have a solution for this when it is in XI, but for Web Dynpro, what is your service pack version?

    Thanks,

    Raj.

  • RELEVANCY SCORE 2.59

    DB:2.59:Evolvable Example kz


    Hi All,

    I am looking for a clean example of using the Evolvable interface. I have tried, but I am getting unsupported-class exceptions.

    Does this feature only work with POF, or with Java serialization as well?

    DB:2.59:Evolvable Example kz

    Hi,

    it is supported out-of-the-box only with POF.

    You could write support for it if you write a custom Serializer for the service, which implements the Evolvable lifecycle for your Evolvable objects.

    In practice it is simplest to implement EvolvablePortableObject. I will post an example a bit later.

    Best regards,

    Robert
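    A minimal sketch of EvolvablePortableObject (all names are hypothetical; this assumes the com.tangosol.io.AbstractEvolvable base class, which keeps track of the data version and any future data for you):

    public class Person extends AbstractEvolvable implements EvolvablePortableObject {
        public static final int IMPL_VERSION = 2;   // bump whenever new properties are added

        private String name;    // present since version 1
        private String email;   // added in version 2

        public Person() {
        }

        public int getImplVersion() {
            return IMPL_VERSION;
        }

        public void readExternal(PofReader in) throws IOException {
            name = in.readString(0);
            if (in.getVersionId() >= 2) {   // only read what the writer actually wrote
                email = in.readString(1);
            }
        }

        public void writeExternal(PofWriter out) throws IOException {
            out.writeString(0, name);
            out.writeString(1, email);
        }
    }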

  • RELEVANCY SCORE 2.59

    DB:2.59:Error While Executing Ecc_Salesordercrtrc Web Service k7



    hello,

    I am getting an error while executing the web service ECC_SALESORDERCRTRC using the Web Service Navigator:

    XML Serialization Error. Object content does not correspond to Schema restrictions of type [http://sap.com/xi/APPL/SE/Global][BusinessDocumentMessageID.Content].

    Can anyone tell me why I am getting this error?

    thanks in advance

    DB:2.59:Error While Executing Ecc_Salesordercrtrc Web Service k7


    Hello.

    I have the same problem.

    What did your Basis guy do to solve your problem?

  • RELEVANCY SCORE 2.59

    DB:2.59:Deplyment Error m1


    I am trying to deploy my SQLJ code containing some "sqlj" statements like insert, etc. While deploying I get the error:

    ORA-29546: badly formed resource: Error java.io.invalid Class Exception :[Ljava.lang.Object Serialization incompatible with externalization}.

    The SQLJ class only contains a string and the insert statement.

    DB:2.59:Deplyment Error m1

    Originally posted by Vineet Mahajan:
    I am trying to deploy my SQLJ code containing some "sqlj" statements like insert, etc. While deploying I get the error:

    ORA-29546: badly formed resource: Error java.io.invalid Class Exception :[Ljava.lang.Object Serialization incompatible with externalization}.

    The SQLJ class only contains a string and the insert statement.

    ----------
    I am currently having this error message too.
    However, when I try running on another PC, the same program works on that PC. Please advise.

  • RELEVANCY SCORE 2.59

    DB:2.59:Near Vs Partitioned Read Performance 9a


    Hi,

    I'm looking to squeeze some extra performance out of Coherence by using a Near+Partitioned cache instead of a Partitioned cache in some places. We do most of our processing using an EntryProcessor with data affinity, so all access to data is local.

    I've seen it written that a Near+Partitioned cache doesn't use serialization for storage in the front scheme (so long as it hits, of course). This should make it marginally faster than fetching locally from the Partitioned cache, which does store serialized versions.

    Yet in the [User Guide|http://coherence.oracle.com/display/COH35UG/Types+of+Caches+in+Coherence#TypesofCachesinCoherence-SummaryofCacheTypes] it suggests that performance is "Instant" for both cases.

    Is it worth looking into this change, or is it unlikely to get us any extra performance?

    BTW, we are using POF Serialization.

    DB:2.59:Near Vs Partitioned Read Performance 9a

    Hi,

    Whether near caches will help you improve performance depends on a number of things. You say that most of your processing uses EntryProcessors, in which case a near cache will have no effect, as an EntryProcessor runs on the node that owns the data and so is always local anyway.

    A near cache only gives an improvement when you do key-based "get" operations, but not for queries, aggregators or invocables.

    You also need to be careful if your underlying caches are updated a lot as this can generate a lot of events for your near caches to process.

    You would normally use a near cache on a client application rather than a cluster member.

    JK

  • RELEVANCY SCORE 2.59

    DB:2.59:Using Pof Objects As Keys In C++ 7m


    Hi David,

    Hi,
    I am currently writing POF objects to the cache. This works fine when I use keys that are native types (i.e. ints, strings, etc.). I run into problems, however, when I use POF objects as keys.

    For example this works
    map-put(coherence::lang::String::create("Hello"), ManagedPVKey::create(mPVKey));
    As does this
    map-put(coherence::lang::String::create("Hello"), ManagedPVValue::create(mPVValue));

    But this does not

    map->put(ManagedPVKey::create(mPVKey), ManagedPVValue::create(mPVValue));

    I get the following error
    Portable(com.tangosol.util.WrapperException): (Wrapped) unknown user type: 1012
    at com.tangosol.util.ExternalizableHelper.fromBinary(ExternalizableHelper.java:261)
    at com.tangosol.coherence.component.util.daemon.queueProcessor.service.grid.DistributedCache$ConverterKeyToBinary.convert(DistributedCache.CDB:19)
    at com.tangosol.util.ConverterCollections$AbstractConverterEntry.getKey(Co

    What do I need to do to get this working?

    I have the POF object registered by using COH_REGISTER_MANAGED_CLASS(1012, PVKey);

    Why is the object recognised as a value but not as a key?

    Thanks in advance for any help

    DB:2.59:Using Pof Objects As Keys In C++ 7m

    Hi Eamon,

    You would put that class on the Java server side and then anywhere that you have com.tangosol.io.pof.ConfigurablePofContext in your cache configuration can be replaced with the custom context class name.

    As I said though, I have only given the class a very quick test and it seems to work around the fact that Coherence tries to deserialize classes just to check for key affinity. There may be situations where there would be problems as any unrecognised class would be deserialized as Binary. Thinking about it overnight (funny what pops into your head when you're trying to sleep) the PassThroughBinarySerializer is not quite right as the Binary returned will not be the same as the original Binary being deserialized but I am sure there are a few tweaks that can fix that if required. For example, one requirement would be if you actually did want key association, your KeyAssociator would be passed the Binary from the PassThroughBinarySerializer which may not be what you really wanted.

    JK

  • RELEVANCY SCORE 2.58

    DB:2.58:Possible To Mix Pof And Non-Pof Objects In The Same Cache? 8c


    Is it possible to mix POF and non-POF objects in the same cache? Or is it all-or-nothing?

    Thanks,
    Andrew

    DB:2.58:Possible To Mix Pof And Non-Pof Objects In The Same Cache? 8c

    robvarga wrote:
    csoto wrote:
    Yes, you can use different serialization strategies for different schemes. But note that the service names must be different; otherwise setting the serializer in one scheme will set it for the service, and if all of those schemes are using the same service, it will apply to all. So make sure to provide a unique service name for each of the distributed schemes.

    Thanks Cris, I know this.

    Best regards,

    Robert

    I know that you know, Rob. : )

    Actually, I posted that serialization strategies comment because in fact I found your previous post about the service level very useful and I just wanted to complement it a little bit. Thank you!

    Best regards,
    Cris
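    For completeness, an illustrative cache-config fragment of the point above (scheme, service, and file names are made up): two distributed schemes on two differently named services, one with a POF serializer and one left on the default Java serializer.

    <distributed-scheme>
      <scheme-name>pof-distributed</scheme-name>
      <service-name>DistributedCachePof</service-name>
      <serializer>
        <class-name>com.tangosol.io.pof.ConfigurablePofContext</class-name>
        <init-params>
          <init-param>
            <param-type>string</param-type>
            <param-value>my-pof-config.xml</param-value>
          </init-param>
        </init-params>
      </serializer>
      <backing-map-scheme>
        <local-scheme/>
      </backing-map-scheme>
      <autostart>true</autostart>
    </distributed-scheme>

    <distributed-scheme>
      <scheme-name>java-distributed</scheme-name>
      <service-name>DistributedCacheJava</service-name>
      <!-- no serializer element: this service uses Java serialization -->
      <backing-map-scheme>
        <local-scheme/>
      </backing-map-scheme>
      <autostart>true</autostart>
    </distributed-scheme>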

  • RELEVANCY SCORE 2.58

    DB:2.58:Error Using Custom Value Extractor 3c


    Hi Forum,

    I am practicing using the PropertyExtractor mentioned in the Oracle Coherence book. It's a simple custom value extractor

    with the following definition:

    public class PropertyExtractor implements ValueExtractor, Serializable {
    private final String propertyName;
    private transient volatile Method readMethod;
    public PropertyExtractor(String propertyName) {
    this.propertyName = propertyName;
    }

    public Object extract(Object o) {
    if (o == null) {
    return null;
    }
    Class targetClass = o.getClass();
    try {
    if (readMethod ==
    null || readMethod.getDeclaringClass() != targetClass) {
    PropertyDescriptor pd =
    new PropertyDescriptor(propertyName, o.getClass());
    readMethod = pd.getReadMethod();
    }
    return readMethod.invoke(o);
    }
    catch (Exception e) {
    throw new RuntimeException(e);
    }
    }
    }

    But I am getting following error while using this.

    java.lang.IllegalArgumentException: unknown user type: org.mylab.util.PropertyExtractor
    at com.tangosol.io.pof.ConfigurablePofContext.getUserTypeIdentifier(ConfigurablePofContext.java:400)
    at com.tangosol.io.pof.ConfigurablePofContext.getUserTypeIdentifier(ConfigurablePofContext.java:389)
    at com.tangosol.io.pof.PofBufferWriter.writeObject(PofBufferWriter.java:1432)
    at com.tangosol.io.pof.PofBufferWriter$UserTypeWriter.writeObject(PofBufferWriter.java:2092)
    at com.tangosol.util.filter.ExtractorFilter.writeExternal(ExtractorFilter.java:174)
    at com.tangosol.util.filter.ComparisonFilter.writeExternal(ComparisonFilter.java:238)

    The cache in use refers to pof-config.xml for its serializer; however, PropertyExtractor implements the Serializable interface, so I don't think I need to define a serializer?

    DB:2.58:Error Using Custom Value Extractor 3c

    Hi Akilan,

    Because your cache is using POF, you must put all the classes you will send over the wire into the pof-config.xml; just making them Serializable is not good enough. You will also need to make org.mylab.util.PropertyExtractor implement PortableObject, give it a default no-args constructor, and remove final from the propertyName field.
    E.G.
    public class PropertyExtractor implements ValueExtractor, PortableObject {

    private String propertyName;
    private transient volatile Method readMethod;

    public PropertyExtractor() {
    }

    public PropertyExtractor(String propertyName) {
    this.propertyName = propertyName;
    }

    public Object extract(Object o) {
    if (o == null) {
    return null;
    }
    Class targetClass = o.getClass();
    try {
    if (readMethod == null || readMethod.getDeclaringClass() != targetClass) {
    PropertyDescriptor pd = new PropertyDescriptor(propertyName, o.getClass());
    readMethod = pd.getReadMethod();
    }
    return readMethod.invoke(o);
    }
    catch (Exception e) {
    throw new RuntimeException(e);
    }
    }

    @Override
    public void readExternal(PofReader pofReader) throws IOException {
    propertyName = pofReader.readString(100);
    }

    @Override
    public void writeExternal(PofWriter pofWriter) throws IOException {
    pofWriter.writeString(100, propertyName);
    }
    }
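    As a follow-up, the extractor class also has to appear in the pof-config.xml of every node that will (de)serialize it. An illustrative entry (the type-id 1010 is made up; it just has to be unique, 1000 or above, and identical everywhere) would look like:

    <user-type>
      <type-id>1010</type-id>
      <class-name>org.mylab.util.PropertyExtractor</class-name>
    </user-type>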

  • RELEVANCY SCORE 2.58

    DB:2.58:Effect Of External Pofserializer On Java Serialization Outside Of Coherence zp


    I was planning on using POF to serialize our cached objects, and to create a PofSerializer instance for each object to keep the serialization code outside of the objects themselves.
    I like this approach because then our business objects are not tied to any Coherence APIs.

    However, if I do things this way, then it seems to me that the objects will use regular Java serialization when passed via RMI. Is this true? If I want the objects to use the POF format whenever they are serialized, regardless of whether it is being done by Coherence or not, do I have to abandon this approach? Or should I have the business object (which implements java.io.Serializable) call the PofSerializer instance in its readObject/writeObject methods?

    Thanks,
    Jim Williams

    DB:2.58:Effect Of External Pofserializer On Java Serialization Outside Of Coherence zp

    What is the "license situation" for using Coherence API:s (POF serialization or a "local cache" for instance) on clients not participating directly in a Coherence cluster (lets assume they communicate using RMI with an application server that is a member of a Coherence cluster)? Would it recuire paying a license fee aslo for each client or are some parts of the Coherence code base "free to use"?

    Best Regards
    Magnus

  • RELEVANCY SCORE 2.58

    DB:2.58:Wcf De Serialization Error -- Finding The Erring Field (Urgent) f8


    Hi Folks,
    Need some urgent help. I am writing a WCF client which calls a service. While deserializing the response from the service, the WCF client errors out with a 'deserialization error'. The inner exception gives the coordinates of the erring field (e.g. 5, 113). This deserialization error in my case is caused by a data type mismatch between the data sent by the service and that in the .NET WCF proxy.
    My question is: how do we pinpoint the exact field based on the coordinates given in the inner exception, if at all, using an HTTP debugger like Fiddler? I am finding it very difficult to pinpoint the erring field based on the inner exception.
    Thanks a lot,
    Saurabh

    DB:2.58:Wcf De Serialization Error -- Finding The Erring Field (Urgent) f8

    Try to nicely pretty-print the SOAP, assuming the first lines are something like
    envelope
    header
    /header
    body

    then the 5th line is one of the first 3 lines under the body; check each one of them.
    Also, you can publish the SOAP here: http://webservices20.blogspot.com/
    WCF Security, Interoperability And Performance Blog

  • RELEVANCY SCORE 2.58

    DB:2.58:Binary Serialization xk



    I have done binary serialization of a class and stored it in the database. Now I have moved this class into another class library (the namespace remains the same), and when I retrieve the stored values from the database and try to deserialize them, I am getting the following error.
     
    Unable to load type XXXXXXXXXXXXXX required for deserialization.
     
    Does the binary-serialized data hold the assembly information too? If not, then my above scenario should work.
     
    Any help or suggestion would be appreciated.
     
     

  • RELEVANCY SCORE 2.56

    DB:2.56:Re: Java.Io.Notserializableexception When Using Pof Over Extend fk


    But the exception on the proxy node indicates that the connection was established:
    2011-07-25 10:11:50.980/267.258 Oracle Coherence GE 3.7.0.2 Error (thread=Proxy:ExtendTcpProxyService:TcpAcceptor, member=13): An exception occurred while decoding a Message for Service=Proxy:ExtendTcpProxyService:TcpAcceptor received from: TcpConnection(Id=null, Open=true, LocalAddress=192.168.3.6:9094, RemoteAddress=192.168.3.8:53492): java.lang.ClassCastException: com.tangosol.io.pof.PortableException cannot be cast to com.tangosol.util.UUID
    at com.tangosol.coherence.component.util.daemon.queueProcessor.service.Peer$MessageFactory$OpenConnectionRequest.readExternal(Peer.CDB:7)
    at com.tangosol.coherence.component.util.daemon.queueProcessor.service.peer.acceptor.TcpAcceptor$MessageFactory$OpenConnectionRequest.readExternal(TcpAcceptor.CDB:1)
    at com.tangosol.coherence.component.net.extend.Codec.decode(Codec.CDB:29)
    at com.tangosol.coherence.component.util.daemon.queueProcessor.service.Peer.decodeMessage(Peer.CDB:25)
    at com.tangosol.coherence.component.util.daemon.queueProcessor.service.Peer.onNotify(Peer.CDB:54)
    at com.tangosol.coherence.component.util.Daemon.run(Daemon.CDB:42)
    at java.lang.Thread.run(Thread.java:662)

    So it must be that the exception on the Extend client is simply a misleading exception. I get the same behavior whether the Extend client app runs locally on the Extend proxy box or not. Also, the network config worked fine prior to switching some services to POF. I can telnet to the port and see that Extend is answering OK.

    Andrew

    DB:2.56:Re: Java.Io.Notserializableexception When Using Pof Over Extend fk

    How would the config look to put the serializer on the service section instead of under initiator-config? Can you really do that?

    -Andrew

  • RELEVANCY SCORE 2.55

    DB:2.55:Regarding Abstractextractor 9m


    Hi,

    We have a problem with AbstractExtractor. For our requirement, Jon Hall previously gave us a user-defined Java class (ListEntryExtractor) which extends AbstractExtractor. Now I am trying the same thing in C#. When I include that entry in pof-config.xml, we get the below exception while connecting:

    2008-01-18 14:23:07.790 Oracle Coherence GE 3.3/387 Error (thread=TcpProcessor, member=1): Error configuring class "com.tangosol.io.pof.ConfigurablePofContext": java.lang.IllegalStateException: Missing PofSerializer configuration (Config=custom-types-pof-config.xml, Type-Id=1007, Class-Name=com.db.gm.rates.tradecapture.tangosol.ListEntryExtractor)
    at com.tangosol.io.pof.ConfigurablePofContext.report(ConfigurablePofContext.java:1258)
    at com.tangosol.io.pof.ConfigurablePofContext.createPofConfig(ConfigurablePofContext.java:964)
    at com.tangosol.io.pof.ConfigurablePofContext.initialize(ConfigurablePofContext.java:724)
    at com.tangosol.io.pof.ConfigurablePofContext.setContextClassLoader(ConfigurablePofContext.java:320)
    at com.tangosol.coherence.component.comm.ConnectionManager.instantiateSerializer(ConnectionManager.CDB:26)
    at com.tangosol.coherence.component.comm.Connection.instantiateChannel(Connection.CDB:36)
    at com.tangosol.coherence.component.comm.Connection.doOpen(Connection.CDB:14)
    at com.tangosol.coherence.component.comm.Connection.open(Connection.CDB:18)
    at com.tangosol.coherence.component.comm.connectionManager.Acceptor.openConnection(Acceptor.CDB:13)
    at com.tangosol.coherence.component.comm.connectionManager.acceptor.TcpAcceptor.openConnection(TcpAcceptor.CDB:9)
    at com.tangosol.coherence.component.comm.connectionManager.acceptor.TcpAcceptor$TcpProcessor.onAccept(TcpAcceptor.CDB:37)
    at com.tangosol.coherence.component.comm.connectionManager.acceptor.TcpAcceptor$TcpProcessor.onSelect(TcpAcceptor.CDB:21)
    at com.tangosol.coherence.component.comm.connectionManager.acceptor.TcpAcceptor$TcpProcessor.onNotify(TcpAcceptor.CDB:15)
    at com.tangosol.coherence.component.util.Daemon.run(Daemon.CDB:35)
    at java.lang.Thread.run(Unknown Source).

    If I don't include the entry in pof-config.xml, I get the exception
    Unknown user type: FrontOffice3DGDLL.ListEntryExtractor.

    Please find the ListEntryExtractor class in C#. It is similar in java as well.

    using System;
    using System.Collections.Generic;
    using System.Text;
    using System.Collections;

    using Tangosol.Util.Extractor;

    namespace FrontOffice3DGDLL
    {
    class ListEntryExtractor : AbstractExtractor
    {
    //private static long serialVersionUID = 7424814693370905451L;
    private int indexPosition;

    public ListEntryExtractor(int indexPosition)
    {
    this.indexPosition = indexPosition;
    }

    public override object Extract(object target)
    {
    if (!(target is ArrayList))
    return null;

    ArrayList list = (ArrayList)target;
    // Check if invalid index position specified
    if ((indexPosition + 1) > list.Count)
    return null;

    return list[indexPosition];
    }

    }
    }

    Could you please advice me on this.

    Regards,
    Satish.

    DB:2.55:Regarding Abstractextractor 9m

    Hi Satish,

    As I mentioned in the previous post, you need to serialize indexPosition field. Basically, your IPortableObject implementation should look like this:

    *.NET:*
    #region IPortableObject Members

    public void ReadExternal(IPofReader reader)
    {
    indexPosition = reader.ReadInt32(1);
    }

    public void WriteExternal(IPofWriter writer)
    {
    writer.WriteInt32(1, indexPosition);
    }

    #endregion

    *Java:*
    public void readExternal(PofReader reader)
    {
    indexPosition = reader.readInt(1);
    }

    public void writeExternal(PofWriter writer)
    {
    writer.writeInt(1, indexPosition);
    }

    Without this, you will always be extracting the item at position 0, as indexPosition will be initialized to 0 during deserialization within the cluster.

    Regards,

    Aleks

  • RELEVANCY SCORE 2.55

    DB:2.55:Why I Need The Same Pof Conf On Client+Srver If Coherence Is Used As Cache? j7


    Hi,

    I have first defined my own pof-config.xml only on my client.
    And Coherence gave me a rather awkward message: "StreamCorruptedException unknown user type 6"!

    I made another attempt, this time defining my own pof-config.xml on my servers too.
    Then, Coherence has worked smoothly, as expected.

    QUESTION: as I use Coherence as a cache, with no index and no entry processors, I expect Coherence will not dig into my byte[] on the 'server' nodes.
    Then I expect only the clients have to serialize/deserialize; they send the byte[], and the servers just store it, without needing any POF config.

    If it works as I have described, why does Coherence fail ("StreamCorruptedException unknown user type 6") when I define my own pof-config.xml only on my client?

    Thanks.

    Regards,
    Dominique

    DB:2.55:Why I Need The Same Pof Conf On Client+Srver If Coherence Is Used As Cache? j7

    Hi Dominique,

    user4947403 wrote:
    Hi Robert,

    robvarga wrote:
    Hi Dominique,

    Coherence does not know what you will do with the data. It has to assume that you may want to send an entry processor or entry aggregator, or query the items with a filter, or you may simply want to get the item from the cache WITHIN the cluster, not only via Extend.

    imho, I find it counter-intuitive. I expected something like: as long as no EP is run, no POF config is looked up, and if no POF config exists when deserialization is needed, raise an exception.
    1. There is no such thing as no config. If you don't specify an explicit configuration, default configuration is used, which (depending on whether POF is globally enabled or not) is either a DefaultSerializer, or a ConfigurablePofContext loading stuff from pof-config.xml.

    2. If some data travels across the proxy connection, it has to travel in the serialization format configured for the proxy service. If you put data into a cache, it has to travel on the network in the serialization format configured for the cache service of the cache. Period.
    If this rule was not followed, Coherence would not know what serialization format any piece of data has. Therefore as mentioned below, if the proxy service serialization configuration differs from the cache service serialization configuration, data has to be de- and reserialized on the proxy.

    Again, imagine the case if your logic were followed: Just because you used some service to put data into the cache, Coherence cannot know that you will use the same service to get the data back. If some code inside the cluster tried to deserialize it, it would fail as it has no idea, that it was serialized with the proxy serialization format. It actually doesn't even have an idea that the data came from the proxy, or even that there is a proxy. Also, if you used some other code to put data into the cache which is not coming via the proxy, you would have another piece of data sitting in the cache which was not serialized by the proxy serialization format. If you tried to retrieve that via the TCP*Extend, you could not deserialize it on the client as it is not in the serialization format used by the proxy. Moreover, the client does not even have any chance of even knowing what it was serialized with as the client does not even see the serialization configuration of services inside the cluster.

    Because of this it has to ensure that it can deserialize the data with the serializer configured for the cache service within the cluster, not with the one used for TCP*Extend.

    Well, in my case, my client is a cluster node with localstorage=false; do the explanations you wrote apply to this case too?

    It does not matter what your cluster looks like at the moment when you consider consistency checks. Nothing prevents you from starting another cluster node, therefore Coherence cannot be lenient in the service configuration consistency checks.

    By the way, if you have only a single storage-disabled cluster node (and that means that no cache server node is running) and you tried to put something into a distributed cache then if the serialization error did not happen because you have correct configuration, then you would have received a Storage not configured error instead, as no cache server node is running to actually store your data.
    So if you did not get any errors, then you were either putting the data into a replicated cache where being storage-disabled is not relevant as that is a distributed cache setting, or you had a cache server node running or you only believed that the node was storage disabled.

    Best regards,

    Robert

  • RELEVANCY SCORE 2.54

    DB:2.54:Serialization In Ioerror ax


    I have learnt that a serialization hierarchy will break if any class in the hierarchy overloads the default constructor, since during deserialization the JVM calls the default constructor. In Java 6, the java.io.IOError class extends Error and, by default, its objects are serializable since Throwable implements Serializable. My question is: in java.io.IOError the default constructor was overloaded, and a serialVersionUID was declared for serialization as,

    private static final long serialVersionUID = 67100927991680413L;

    What is the use of this? Please explain.
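
    For context, the declaration being asked about normally sits in a Serializable class like the following; the class and field here are made up for illustration, and the comment states what the JVM does with the value:

    import java.io.Serializable;

    public class Account implements Serializable {

        // Explicit version stamp for the serialized form. During deserialization
        // the JVM compares this value with the serialVersionUID recorded in the
        // stream and throws InvalidClassException if the two do not match.
        private static final long serialVersionUID = 67100927991680413L;

        private String owner;

        public Account(String owner) {
            this.owner = owner;
        }

        public String getOwner() {
            return owner;
        }
    }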

  • RELEVANCY SCORE 2.54

    DB:2.54:Incorrect Coherence-Rest-Pof-Config.Xml In Coherence-Rest.Jar (Missing Pofserializer Configuration)? jm


    I'm running Coherence 3.7.1.0.0 with REST enabled as per the instruction. I have included coherence-rest-pof-config.xml in my pof config like so:
    <include>coherence-pof-config.xml</include>
    <include>coherence-rest-pof-config.xml</include>

    When I start Coherence, I get the following error:
    Caused by: (Wrapped: error creating class "com.tangosol.io.pof.ConfigurablePofContext") java.lang.IllegalStateException: Missing PofSerializer configuration (Config=custom-types-pof-config.xml, Type-Id=801, Class-Name=com.tangosol.coherence.rest.internal.Get)

    The config in coherence-rest-pof-config.xml for com.tangosol.coherence.rest.internal.Get is as follows:
    <user-type>
        <type-id>801</type-id>
        <class-name>com.tangosol.coherence.rest.internal.Get</class-name>
    </user-type>

    I had a quick look in the coherence-rest.jar, where com.tangosol.coherence.rest.internal.Get is defined. com.tangosol.coherence.rest.internal.Get implements InvocableMap.EntryProcessor, but not PortableObject. Am I missing something? As far as I can see, InvocableMap.EntryProcessor does not implement PortableObject either. Is there any way to fix this?

    Thanks


  • RELEVANCY SCORE 2.53

    DB:2.53:Pof Serializer, Problems With Large Dates j8


    Hi,

    I seem to have run into a problem with POF serialization of dates. For some reason the C++ deserialization of dates checks whether the date is valid. This seems beyond the bounds of a serializer; in any case, performance aside, the POF rawDateTime thinks that the date 1/1/3000 is invalid. We have some fields with dates far in the future, and I know the database, Java, and our C++ code can all handle these dates.

    The second approach is to serialize the date to a long; however, the date can be null. So how do I represent a null date? Maybe I could use 0 or -1, but this introduces a comparison for every date we serialize, something I want to avoid.

    Is there a way of telling the c++ coherence POF serializer that this date is not out of range?

    Cheers
    Rich

    DB:2.53:Pof Serializer, Problems With Large Dates j8

    Hi Rich,

    This issue was identified after 3.6.0 had shipped. While the fix has been made to both the Coherence 3.5 and 3.6 codelines, only the 3.5 version has been made publicly available as a patch. Based on the comments in the issue, it looks like the 3.6 fix is slated to be part of the upcoming 3.6 SP1 (3.6.1). As for interim workarounds, the only thing I can suggest is to encode the value using the Coherence C++ Integer64 object (the equivalent of java.lang.Long); this allows you both to have a NULL value and to encode any date you choose.
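
    On the Java side, the same idea can be expressed by carrying the date as a boxed Long of epoch milliseconds. A minimal sketch, assuming a hypothetical class with a single java.util.Date field (the class name, field name, and POF index are made up for illustration):

    import java.io.IOException;
    import java.util.Date;

    import com.tangosol.io.pof.PofReader;
    import com.tangosol.io.pof.PofWriter;
    import com.tangosol.io.pof.PortableObject;

    public class Instrument implements PortableObject {

        public static final int EXPIRY = 0;   // hypothetical POF index

        private Date expiry;                   // may legitimately be null

        public void readExternal(PofReader in) throws IOException {
            // A null Long round-trips as a null reference, so no sentinel value is needed
            Long millis = (Long) in.readObject(EXPIRY);
            expiry = (millis == null) ? null : new Date(millis.longValue());
        }

        public void writeExternal(PofWriter out) throws IOException {
            out.writeObject(EXPIRY, (expiry == null) ? null : Long.valueOf(expiry.getTime()));
        }
    }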

    thanks,

    Mark
    Oracle Coherence

  • RELEVANCY SCORE 2.53

    DB:2.53:Missing Pofserializer Configuration 1s


    I am a newbie to Coherence and am trying to get POF working.

    I am getting the following error. I realize that this has been posted before, but the solutions don't seem to help me.

    Coherence 3.6

    (Wrapped: error configuring class "com.tangosol.io.pof.ConfigurablePofContext") java.lang.IllegalStateException: Missing PofSerializer configuration (Config=/cgbu/home4/anasthan/Oracle/Middleware/coherence_3.6/lib/tokens-pof-config.xml, Type-Id=1001, Class-Name=oracle.communications.activation.ace.Token)
    at com.tangosol.coherence.component.util.Daemon.start(Daemon.CDB:52)
    at com.tangosol.coherence.component.util.daemon.queueProcessor.Service.start(Service.CDB:7)
    at com.tangosol.coherence.component.util.daemon.queueProcessor.service.Grid.start(Grid.CDB:6)
    at com.tangosol.coherence.component.util.SafeService.startService(SafeService.CDB:28)
    at com.tangosol.coherence.component.util.safeService.SafeCacheService.startService(SafeCacheService.CDB:5)
    at com.tangosol.coherence.component.util.SafeService.ensureRunningService(SafeService.CDB:27)

    I packaged the oracle.communications.activation.ace.Token class, tokens-pof-config.xml and coherence-cache-config.xml in the Coherence.jar

    Cache-server.sh
    ================
    #!/bin/sh

    # This will start a cache server

    # specify the Coherence installation directory
    COHERENCE_HOME=.

    # specify the JVM heap size
    MEMORY=512m

    if [ ! -f ${COHERENCE_HOME}/bin/cache-server.sh ]; then
    echo "coherence.sh: must be run from the Coherence installation directory."
    exit
    fi

    if [ -f $JAVA_HOME/bin/java ]; then
    JAVAEXEC=$JAVA_HOME/bin/java
    else
    JAVAEXEC=java
    fi

    JAVA_OPTS="-Xms$MEMORY -Xmx$MEMORY"
    CONFIG_FILE_PATH="file:///cgbu/home4/anasthan/Oracle/Middleware/coherence_3.6/lib/tokens-pof-config.xml"

    $JAVAEXEC -server -showversion $JAVA_OPTS -cp "$COHERENCE_HOME/lib/coherence.jar" com.tangosol.net.DefaultCacheServer $1

    Token.java
    =========
    package oracle.communications.activation.ace;

    import java.io.IOException;
    import java.io.Serializable;
    import com.tangosol.io.pof.PortableObject;
    import com.tangosol.io.pof.PofReader;
    import com.tangosol.io.pof.PofWriter;
    import java.sql.*;
    import java.util.Enumeration;

    public class Token implements PortableObject {

    /**
    * 1 - Unassigned
    * 2 - Available
    * 3 - Reserved
    * 4 - defunct
    */
    private int state;

    /**
    * NE-ID(s)
    */
    private String neID;

    /**
    * Number of tokens currently Active
    */
    private int tokensCurrentlyActive;

    /**
    * Max - Number of tokens available
    */
    private int maxTokensAvailable;

    /**
    * unqiue Token ID, used to identify Tokens
    */
    private int tokenID;

    /**
    *
    * POF index for data members
    */
    public static final int TOKENID = 0;
    public static final int STATE = 1;
    public static final int NEID = 2;
    public static final int CURTOKEN = 3;
    public static final int MAXTOKEN = 4;

    /**
    *
    * @param state
    */
    public void setState(int state) {
    this.state = state;
    }

    /**
    *
    * @return
    */
    public int getState() {
    return state;
    }

    /**
    *
    * @param neID
    */
    public void setNeID(String neID) {
    this.neID = neID;
    }

    /**
    *
    * @return
    */
    public String getNeID() {
    return neID;
    }

    /**
    *
    * @param tokensCurrentlyActive
    */
    public void setTokensCurrentlyActive(int tokensCurrentlyActive) {
    this.tokensCurrentlyActive = tokensCurrentlyActive;
    }

    /**
    *
    * @return
    */
    public int getTokensCurrentlyActive() {
    return tokensCurrentlyActive;
    }

    /**
    *
    * @param maxTokensAvailable
    */
    public void setMaxTokensAvailable(int maxTokensAvailable) {
    this.maxTokensAvailable = maxTokensAvailable;
    }

    /**
    *
    * @return
    */
    public int getMaxTokensAvailable() {
    return maxTokensAvailable;
    }

    /**
    *
    * @param tokenID
    */
    public void setTokenID(int tokenID) {
    this.tokenID = tokenID;
    }

    /**
    *
    * @return
    */
    public int getTokenID() {
    return tokenID;
    }

    // ----- PortableObject interface ---------------------------------------

    /**
    * {@inheritDoc}
    */
    public void readExternal(PofReader reader)
    throws IOException
    {
    tokenID = Integer.parseInt(reader.readString(TOKENID));
    neID = reader.readString(NEID);
    tokensCurrentlyActive = Integer.parseInt(reader.readString(CURTOKEN));
    maxTokensAvailable = Integer.parseInt(reader.readString(MAXTOKEN));
    state = Integer.parseInt(reader.readString(STATE));
    }

    /**
    * {@inheritDoc}
    */
    public void writeExternal(PofWriter writer)
    throws IOException
    {
    writer.writeString(TOKENID, Integer.toString(tokenID));
    writer.writeString(NEID, neID);
    // Write the field values, not the POF index constants
    writer.writeString(CURTOKEN, Integer.toString(tokensCurrentlyActive));
    writer.writeString(MAXTOKEN, Integer.toString(maxTokensAvailable));
    writer.writeString(STATE, Integer.toString(state));
    }
    }

    tokens-pof-config.xml
    =============

    ?xml version="1.0"?
    pof-config xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xmlns="http://xmlns.oracle.com/coherence/coherence-pof-config"
    xsi:schemaLocation="http://xmlns.oracle.com/coherence/coherence-pof-config coherence-pof-config.xsd"

    user-type-list
    !-- coherence POF user types --
    includecoherence-pof-config.xml/include

    !-- com.tangosol.examples package --
    user-type
    type-id1001/type-id
    class-nameoracle.communications.activation.ace.Token/class-name
    /user-type
    /user-type-list
    allow-interfacestrue/allow-interfaces
    allow-subclassestrue/allow-subclasses
    /pof-config

    coherence-cache-config.xml
    =================
    <distributed-scheme>
        <scheme-name>example-distributed</scheme-name>
        <service-name>DistributedCache</service-name>
        <serializer>
            <instance>
                <class-name>com.tangosol.io.pof.ConfigurablePofContext</class-name>
                <init-params>
                    <init-param>
                        <param-type>String</param-type>
                        <param-value system-property="tokens-pof-config.xml">/cgbu/home4/anasthan/Oracle/Middleware/coherence_3.6/lib/tokens-pof-config.xml</param-value>
                    </init-param>
                </init-params>
            </instance>
        </serializer>
        <backing-map-scheme>
            <local-scheme>
                <scheme-ref>example-binary-backing-map</scheme-ref>
            </local-scheme>
        </backing-map-scheme>
        <autostart>true</autostart>
    </distributed-scheme>

    I am not sure what is wrong, please help!

    DB:2.53:Missing Pofserializer Configuration 1s

    Hi,

    First of all, it is not recommended to package your application-specific configuration and classes inside coherence.jar, even though it should work. Try the following:

    1. Create a new jar and put your Token class inside it; Modify your cache-server.sh to include this new jar in your classpath
    2. Add a property -Dtangosol.coherence.cacheconfig=location of your cache configuration in your cache-server.sh

    $JAVAEXEC -server -showversion $JAVA_OPTS -Dtangosol.coherence.cacheconfig=location of your cache configuration -cp "$COHERENCE_HOME/lib/coherence.jar:+new jar location+" com.tangosol.net.DefaultCacheServer $1

    You do not need to put the POF config on the classpath or pass any property for it, as its location is already specified in your cache configuration.

    Hope this helps!

    Cheers,
    NJ


  • RELEVANCY SCORE 2.53

    DB:2.53:Problem Getting Pof Object But No Trouble Querying 1c


    I'm working with a C++ client and have defined a POF object derived from portable object that consists of 3 strings, 1 long and a byte array.

    I am able to put and get these objects into and from a cache.

    I am also able to query these objects using a PofExtractor/filter combination in both the Java server and a Java client. (To do this, I created a Java version of the object and added an entry to pof config file which is referenced in the serializer node defining com.tangosol.io.pof.ConfigurablePofContext).

    However, when I try to get an object using Java then I get a 'java.io.StreamCorruptedException: unknown user type' exception.

    Specifically

    ret = (MyObj) cache1.get(key);

    gives

    Exception in thread "main" (Wrapped) java.io.StreamCorruptedException: unknown user type: 1111
    at com.tangosol.util.ExternalizableHelper.fromBinary(ExternalizableHelper.java:266)
    at com.tangosol.coherence.component.net.extend.RemoteNamedCache$ConverterFromBinary.convert(RemoteNamedCache.CDB:4)
    at com.tangosol.util.ConverterCollections$ConverterMap.get(ConverterCollections.java:1559)
    at com.tangosol.coherence.component.net.extend.RemoteNamedCache.get(RemoteNamedCache.CDB:1)
    at com.tangosol.coherence.component.util.SafeNamedCache.get(SafeNamedCache.CDB:1)
    at Client.protoCacheClient.get(protoCacheClient.java:88)
    at Client.protoCacheClient.runOpt(protoCacheClient.java:76)
    at Client.protoCacheClient.main(protoCacheClient.java:37)
    Caused by: java.io.StreamCorruptedException: unknown user type: 1111
    at com.tangosol.io.pof.PofBufferReader.readAsObject(PofBufferReader.java:3302)
    at com.tangosol.io.pof.PofBufferReader.readObject(PofBufferReader.java:2603)
    at com.tangosol.io.pof.ConfigurablePofContext.deserialize(ConfigurablePofContext.java:358)
    at com.tangosol.util.ExternalizableHelper.deserializeInternal(ExternalizableHelper.java:2708)
    at com.tangosol.util.ExternalizableHelper.fromBinary(ExternalizableHelper.java:262)

    Can you tell me where I might be going wrong here? I checked that the serialization IDs all match but I'm not sure where to go after that. Thanks!

    DB:2.53:Problem Getting Pof Object But No Trouble Querying 1c

    Figured it out-- there was a mismatch in the type IDs on the C++ side compared to the Java side.

    COH_REGISTER_PORTABLE_CLASS(1001, MDSCacheDataPO);

    Good to go now.

    It looks like querying works because the object is not deserialized.

  • RELEVANCY SCORE 2.53

    DB:2.53:Creating Extractors When Pof Serialization Is Enabled 7d


    We are currently trying to move our application from XmlBean to POF as the serialization mechanism. I am facing some issues with the extractors we currently have in our application. For POF serialization we are implementing the PofSerializer interface.

    I have an extractor like this :

    public class TransactionExtractor extends AbstractExtractor {

    private static final long serialVersionUID = -1L;

    @Override
    public Object extract(Object obj) {
    Transaction transaction = null;
    if (obj instanceof Trade) {
    Trade trade = (Trade) obj;
    transaction = new Transaction();
    transaction.setPrice(trade.getPrice());
    }
    return transaction;
    }

    }

    ----------------

    The object 'Transaction' is :

    public class Transaction extends XmlBean{
    private Double price;

    public Double getPrice() {
    return price;
    }

    public void setPrice(Double price) {
    this.price = price;
    }

    }

    --------------------------------------------

    Now I execute the following test case :

    String key = "key";
    NamedCache cache = CacheFactory.getCache(this.getClass().getName());
    cache.put(key, createTrade());

    Map queryResultMap = cache.invokeAll(new AlwaysFilter(), new ExtractorProcessor(new TransactionExtractor()));
    Transaction transaction = (Transaction) queryResultMap.get(key);
    assertNotNull(transaction);

    This test case passes without any problems. I was expecting it to fail as the extractor is not portable. Is it not necessary that my extractor (TransactionExtractor) be portable? Or is it that this is happening in the same JVM and hence there was no need to serialize and deserialize?

    If I want to make this extractor a PofExtractor, will something like this work, or do I need to do something extra (see also the sketch after this snippet):

    public class TransactionExtractor extends PofExtractor {

    ----
    }
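
    For reference, the earlier ListEntryExtractor thread above suggests one way such an extractor is usually made wire-friendly: keep extending AbstractExtractor, implement PortableObject, and register the class as a user type in the POF configuration on both client and cluster. A minimal, untested sketch along those lines (the extractor holds no state, so there is nothing to read or write):

    import java.io.IOException;

    import com.tangosol.io.pof.PofReader;
    import com.tangosol.io.pof.PofWriter;
    import com.tangosol.io.pof.PortableObject;
    import com.tangosol.util.extractor.AbstractExtractor;

    public class TransactionExtractor extends AbstractExtractor implements PortableObject {

        @Override
        public Object extract(Object obj) {
            // Same extraction logic as in the original class above
            Transaction transaction = null;
            if (obj instanceof Trade) {
                Trade trade = (Trade) obj;
                transaction = new Transaction();
                transaction.setPrice(trade.getPrice());
            }
            return transaction;
        }

        // Stateless extractor: nothing to serialize besides the type itself
        public void readExternal(PofReader in) throws IOException {
        }

        public void writeExternal(PofWriter out) throws IOException {
        }
    }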


  • RELEVANCY SCORE 2.53

    DB:2.53:Overriding Eventdistributionpattern-Pof Throwing Unable To Load Class Error 91


    Hi,

    We were trying to override the "coherence-eventdistributionpattern-pof-config.xml" with our own event distribution POF config due to conflicting type-ids. We changed the type-ids for a few of the classes and left the others as they were. The classes we changed are:

    com.oracle.coherence.patterns.eventdistribution.EventDistributor$Identifier
    com.oracle.coherence.patterns.eventdistribution.events.DistributableEntry
    com.oracle.coherence.patterns.eventdistribution.events.DistributableEntryInsertedEvent
    com.oracle.coherence.patterns.eventdistribution.events.DistributableEntryUpdatedEvent
    com.oracle.coherence.patterns.eventdistribution.events.DistributableEntryRemovedEvent

    When I start the Coherence server, I get an error loading the class "com.oracle.coherence.patterns.eventdistribution.channels.cache.ParallelLocalCacheEventChannelBuilder", for which we did not change the type-id at all.

    Coherence version 3.7.10

    Error trace:

    (thread=DistributedCache:DistributedCacheForSequenceGenerators, member=2): PartitionedCache caught an unhandled exception (com.tangosol.util.WrapperException: (Wrapped: error creating class "com.tangosol.io.pof.ConfigurablePofContext") (Wrapped: Unable to load class for user type (Config=test/coherence/grid/obj-pof-config.xml, Type-Id=13402, Class-Name=com.oracle.coherence.patterns.eventdistribution.channels.cache.ParallelLocalCacheEventChannelBuilder)) (Wrapped) com.oracle.coherence.patterns.eventdistribution.channels.cache.ParallelLocalCacheEventChannelBuilder) while exiting.
    Error (thread=DistributedCache:DistributedCacheForSequenceGenerators, member=2): ClusterService.doServiceLeft: Unknown Service PartitionedCache{Name=DistributedCacheForSequenceGenerators, State=(SERVICE_STOPPED), Not initialized}
    (thread=DistributedCache:DistributedCacheForSequenceGenerators, member=2): Service DistributedCacheForSequenceGenerators left the cluster
    Error (thread=main, member=2): Error while starting service "DistributedCacheForSequenceGenerators": (Wrapped) (Wrapped: error creating class "com.tangosol.io.pof.ConfigurablePofContext") (Wrapped: Unable to load class for user type (Config=test/coherence/grid/obj-pof-config.xml, Type-Id=13402, Class-Name=com.oracle.coherence.patterns.eventdistribution.channels.cache.ParallelLocalCacheEventChannelBuilder)) (Wrapped) java.lang.ClassNotFoundException: com.oracle.coherence.patterns.eventdistribution.channels.cache.ParallelLocalCacheEventChannelBuilder at com.tangosol.coherence.component.util.Daemon.start(Daemon.CDB:52)

    Thanks,


  • RELEVANCY SCORE 2.53

    DB:2.53:Portable Object Format 71


    Hi all, I am working on Portable Object Format with Coherence. My requirement is to build a portable object using Java so that every platform can use that object.
    Whenever I start Coherence with my cache name, it reports that the pof-config file cannot be found. How do I resolve this error? Is there anything wrong in my Coherence cache config? If so, please tell me.
    Is the following the correct way to build a portable object?

    *1.Created the java class ( Manager.java) -- portable object interface*
    package com.Manager;

    import java.io.IOException;

    import com.tangosol.io.pof.PofReader;
    import com.tangosol.io.pof.PofWriter;
    import com.tangosol.io.pof.PortableObject;

    public class Manager implements PortableObject
    {
    private String m_id ;
    private String m_name ;

    public Manager()
    {

    }

    public Manager(String m_id,String m_name)
    {
    this.m_id=m_id;
    this.m_name=m_name;
    }

    public String getId()
    {
    return m_id;
    }
    public void setId(String id)
    {
    m_id = id;
    }

    public String getName()
    {
    return m_name;
    }
    public void setName(String name)
    {
    m_name = name;
    }
    @Override
    public void readExternal(PofReader arg0) throws IOException {
    // TODO Auto-generated method stub
    m_id = arg0.readString(0);
    m_name = arg0.readString(1);
    }
    @Override
    public void writeExternal(PofWriter arg0) throws IOException {
    arg0.writeString(0, m_id);
    arg0.writeString(1, m_name);
    }
    }
    *2. Created the POF serializer class (Manager2.java)*
    package com.Manager;

    import java.io.IOException;

    import com.tangosol.io.pof.PofReader;
    import com.tangosol.io.pof.PofSerializer;
    import com.tangosol.io.pof.PofWriter;

    public class Manager2 implements PofSerializer // POF serializer for Manager
    {
    public void serialize(PofWriter out, Object o)
    throws IOException
    {
    Manager trade = (Manager) o;
    out.writeObject(0, trade.getId());
    out.writeString(1, trade.getName());

    // mark that writing the object is done
    out.writeRemainder(null);
    }

    public Object deserialize(PofReader in)
    throws IOException
    {

    String ldtPlaced = in.readString(0);
    String name = in.readString(1);

    // mark that reading the object is done
    in.readRemainder();

    return new Manager(ldtPlaced,name);
    }
    }
    *3. Created another Java class to put the data into the cache*
    package com.Manager;

    import com.Manager.Manager;
    import com.tangosol.net.CacheFactory;
    import com.tangosol.net.NamedCache;

    public class Main {

    /**
    * @param args
    */
    public static void main(String[] args) {
    // TODO Auto-generated method stub

    String m_id = "123";
    String m_name ="laxman";
    CacheFactory.ensureCluster();
    NamedCache cache = CacheFactory.getCache("hello");
    /*Manager m1 = new Manager();
    m1.setId("1");
    m1.setName("laxman");*/
    cache.put(m_id,m_name);

    }

    }
    **4.created coherence-cache-config.xml**
    <!DOCTYPE cache-config SYSTEM "cache-config.dtd">
    <cache-config>

        <caching-scheme-mapping>
            <cache-mapping>
                <cache-name>hello</cache-name>
                <scheme-name>Distributed</scheme-name>
            </cache-mapping>
        </caching-scheme-mapping>
        <caching-schemes>
            <distributed-scheme>
                <scheme-name>Distributed</scheme-name>
                <service-name>DistributedCache</service-name>
                <serializer>
                    <instance>
                        <class-name>com.tangosol.io.pof.ConfigurablePofContext</class-name>
                        <init-params>
                            <init-param>
                                <param-type>String</param-type>
                                <param-value>pof-config.xml</param-value>
                            </init-param>
                        </init-params>
                    </instance>
                </serializer>
                <backing-map-scheme>
                    <local-scheme>
                        <!-- each node will be limited to 250MB -->
                        <high-units>250M</high-units>
                        <unit-calculator>binary</unit-calculator>
                    </local-scheme>
                </backing-map-scheme>
                <autostart>true</autostart>
            </distributed-scheme>
        </caching-schemes>
    </cache-config>

    ****5.created coherence-pof-config.xml****

    ?xml version="1.0"?
    !DOCTYPE pof-config SYSTEM "pof-config.dtd"
    pof-config

    user-type-list
    includeManager-pof-config.xml/include
    user-type
    type-id1000/type-id
    class-namecom.Manager.Manager/class-name
    serializer
    class-namecom.Manager.Manager2/class-name
    /serializer
    /user-type
    /user-type-list
    /pof-config

    ****6.My Cache-server.cmd****

    @echo off
    @
    @rem This will start a cache server
    @
    setlocal

    :config
    @rem specify the Coherence installation directory
    set coherence_home=%~dp0\..
    @rem specify the JVM heap size
    set memory=512m

    :start
    if not exist "%coherence_home%\lib\coherence.jar" goto instructions

    if "%java_home%"=="" (set java_exec=java) else (set java_exec=%java_home%\bin\java)

    :launch

    set java_opts="-Xms%memory% -Xmx%memory% -Dtangosol.pof.config=file:/C:/Users/lakshmana/JPACoherenceWorkspace/Application/appClientModule/Manager-pof-config.xml -Dtangosol.coherence.cacheconfig=file:/C:/Users/lakshmana/JPACoherenceWorkspace/Application/appClientModule/coherence-cache-config.xml"

    "%java_exec%" -server -showversion "%java_opts%" -cp "%coherence_home%\lib\coherence.jar" com.tangosol.net.DefaultCacheServer %1

    goto exit

    :instructions

    echo Usage:
    echo ^coherence_home^\bin\cache-server.cmd
    goto exit

    :exit
    endlocal
    @echo on

    ***7.Coherence-cache-server.cmd***

    @echo off
    @
    @rem This will start a console application
    @rem demonstrating the functionality of the Coherence(tm) API
    @
    setlocal

    :config
    @rem specify the Coherence installation directory
    set coherence_home=%~dp0\..

    @rem specify if the console will also act as a server
    set storage_enabled=false

    @rem specify the JVM heap size
    set memory=64m

    :start
    if not exist "%coherence_home%\lib\coherence.jar" goto instructions

    if "%java_home%"=="" (set java_exec=java) else (set java_exec=%java_home%\bin\java)

    :launch

    if "%storage_enabled%"=="true" (echo ** Starting storage enabled console **) else (echo ** Starting storage disabled console **)

    set java_opts="-Xms%memory% -Xmx%memory% -Dtangosol.coherence.distributed.localstorage=true -Dtangosol.coherence.cacheconfig=file:/C:/Users/Praveen/workspace/SimpleCacheApp/build/classes/coherence-cache-config.xml -Dtangosol.coherence.override=file:/C:/Users/Praveen/workspace/SimpleCacheApp/src/tangosol-coherence-override.xml"

    "%java_exec%" -server -showversion "%java_opts%" -cp "%coherence_home%\lib\coherence.jar" com.tangosol.net.CacheFactory %1

    goto exit

    :instructions

    echo Usage:
    echo ^coherence_home^\bin\coherence.cmd
    goto exit

    :exit
    endlocal
    @echo on

    **The cache server works fine, but when I start the Coherence console and enter "cache hello" for my cache name, it shows the following error**

    Map (?): cache hello
    2012-06-14 22:48:02.475/7.590 Oracle Coherence GE 3.6.0.4 Info (thread=main, member=11): Loaded cache configuration from "file:/C:/Users/lakshmana/JPACoherenceWorkspace/Application/appClientModule/c
    oherence-cache-config.xml"
    2012-06-14 22:48:02.609/7.724 Oracle Coherence GE 3.6.0.4 D4 (thread=DistributedCache, member=11): PartitionedCache caught an unhandled exception (com.tangosol.util.WrapperException: (Wrapped: error
    configuring class "com.tangosol.io.pof.ConfigurablePofContext") (Wrapped: Failed to load POF configuration: Manager-config.xml) The POF configuration is missing: "Manager-config.xml", loader=sun.misc
    .Launcher$AppClassLoader@6d6f0472) while exiting.
    2012-06-14 22:48:02.610/7.725 Oracle Coherence GE 3.6.0.4 Error (thread=DistributedCache, member=11): ClusterService.doServiceLeft: Unknown Service PartitionedCache{Name=DistributedCache, State=(SER
    VICE_STOPPED), Not initialized}
    2012-06-14 22:48:02.610/7.725 Oracle Coherence GE 3.6.0.4 D5 (thread=DistributedCache, member=11): Service DistributedCache left the cluster
    2012-06-14 22:48:02.625/7.740 Oracle Coherence GE 3.6.0.4 Error (thread=main, member=11): Error while starting service "DistributedCache": (Wrapped) (Wrapped: error configuring class "com.tangosol.i
    o.pof.ConfigurablePofContext") (Wrapped: Failed to load POF configuration: Manager-config.xml) java.io.IOException: The POF configuration is missing: "Manager-config.xml", loader=sun.misc.Launcher$App
    ClassLoader@6d6f0472
    at com.tangosol.coherence.component.util.Daemon.start(Daemon.CDB:52)
    at com.tangosol.coherence.component.util.daemon.queueProcessor.Service.start(Service.CDB:7)
    at com.tangosol.coherence.component.util.daemon.queueProcessor.service.Grid.start(Grid.CDB:6)
    at com.tangosol.coherence.component.util.SafeService.startService(SafeService.CDB:28)
    at com.tangosol.coherence.component.util.safeService.SafeCacheService.startService(SafeCacheService.CDB:5)
    at com.tangosol.coherence.component.util.SafeService.ensureRunningService(SafeService.CDB:27)
    at com.tangosol.coherence.component.util.SafeService.start(SafeService.CDB:14)
    at com.tangosol.net.DefaultConfigurableCacheFactory.ensureServiceInternal(DefaultConfigurableCacheFactory.java:1057)
    at com.tangosol.net.DefaultConfigurableCacheFactory.ensureService(DefaultConfigurableCacheFactory.java:892)
    at com.tangosol.net.DefaultConfigurableCacheFactory.ensureCache(DefaultConfigurableCacheFactory.java:874)
    at com.tangosol.net.DefaultConfigurableCacheFactory.configureCache(DefaultConfigurableCacheFactory.java:1231)
    at com.tangosol.net.DefaultConfigurableCacheFactory.ensureCache(DefaultConfigurableCacheFactory.java:290)
    at com.tangosol.net.CacheFactory.getCache(CacheFactory.java:735)
    at com.tangosol.coherence.component.application.console.Coherence.doCache(Coherence.CDB:18)
    at com.tangosol.coherence.component.application.console.Coherence.processCommand(Coherence.CDB:209)
    at com.tangosol.coherence.component.application.console.Coherence.run(Coherence.CDB:37)
    at com.tangosol.coherence.component.application.console.Coherence.main(Coherence.CDB:3)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at com.tangosol.net.CacheFactory.main(CacheFactory.java:1400)
    Caused by: (Wrapped: error configuring class "com.tangosol.io.pof.ConfigurablePofContext") (Wrapped: Failed to load POF configuration: Manager-config.xml) java.io.IOException: The POF configuration is
    missing: "Manager-config.xml", loader=sun.misc.Launcher$AppClassLoader@6d6f0472
    at com.tangosol.coherence.component.util.daemon.queueProcessor.Service.instantiateSerializer(Service.CDB:17)
    at com.tangosol.coherence.component.util.daemon.queueProcessor.Service.ensureSerializer(Service.CDB:31)
    at com.tangosol.coherence.component.util.daemon.queueProcessor.Service.ensureSerializer(Service.CDB:4)
    at com.tangosol.coherence.component.util.daemon.queueProcessor.service.Grid.onEnter(Grid.CDB:26)
    at com.tangosol.coherence.component.util.daemon.queueProcessor.service.grid.PartitionedService.onEnter(PartitionedService.CDB:19)
    at com.tangosol.coherence.component.util.Daemon.run(Daemon.CDB:14)
    at java.lang.Thread.run(Thread.java:662)
    Caused by: (Wrapped: Failed to load POF configuration: Manager-config.xml) java.io.IOException: The POF configuration is missing: "Manager-config.xml", loader=sun.misc.Launcher$AppClassLoader@6d6f0472

    at com.tangosol.util.Base.ensureRuntimeException(Base.java:293)
    at com.tangosol.run.xml.XmlHelper.loadResourceInternal(XmlHelper.java:330)
    at com.tangosol.run.xml.XmlHelper.loadFileOrResource(XmlHelper.java:281)
    at com.tangosol.io.pof.ConfigurablePofContext.createPofConfig(ConfigurablePofContext.java:813)
    at com.tangosol.io.pof.ConfigurablePofContext.initialize(ConfigurablePofContext.java:775)
    at com.tangosol.io.pof.ConfigurablePofContext.setContextClassLoader(ConfigurablePofContext.java:319)
    at com.tangosol.coherence.component.util.daemon.queueProcessor.Service.instantiateSerializer(Service.CDB:13)
    ... 6 more
    Caused by: java.io.IOException: The POF configuration is missing: "Manager-config.xml", loader=sun.misc.Launcher$AppClassLoader@6d6f0472
    at com.tangosol.run.xml.XmlHelper.loadResourceInternal(XmlHelper.java:316)
    ... 11 more

    2012-06-14 22:48:02.625/7.740 Oracle Coherence GE 3.6.0.4 Error (thread=main, member=11):
    (Wrapped) (Wrapped: error configuring class "com.tangosol.io.pof.ConfigurablePofContext") (Wrapped: Failed to load POF configuration: Manager-config.xml) java.io.IOException: The POF configuration is
    missing: "Manager-config.xml", loader=sun.misc.Launcher$AppClassLoader@6d6f0472
    at com.tangosol.coherence.component.util.Daemon.start(Daemon.CDB:52)
    at com.tangosol.coherence.component.util.daemon.queueProcessor.Service.start(Service.CDB:7)
    at com.tangosol.coherence.component.util.daemon.queueProcessor.service.Grid.start(Grid.CDB:6)
    at com.tangosol.coherence.component.util.SafeService.startService(SafeService.CDB:28)
    at com.tangosol.coherence.component.util.safeService.SafeCacheService.startService(SafeCacheService.CDB:5)
    at com.tangosol.coherence.component.util.SafeService.ensureRunningService(SafeService.CDB:27)
    at com.tangosol.coherence.component.util.SafeService.start(SafeService.CDB:14)
    at com.tangosol.net.DefaultConfigurableCacheFactory.ensureServiceInternal(DefaultConfigurableCacheFactory.java:1057)
    at com.tangosol.net.DefaultConfigurableCacheFactory.ensureService(DefaultConfigurableCacheFactory.java:892)
    at com.tangosol.net.DefaultConfigurableCacheFactory.ensureCache(DefaultConfigurableCacheFactory.java:874)
    at com.tangosol.net.DefaultConfigurableCacheFactory.configureCache(DefaultConfigurableCacheFactory.java:1231)
    at com.tangosol.net.DefaultConfigurableCacheFactory.ensureCache(DefaultConfigurableCacheFactory.java:290)
    at com.tangosol.net.CacheFactory.getCache(CacheFactory.java:735)
    at com.tangosol.coherence.component.application.console.Coherence.doCache(Coherence.CDB:18)
    at com.tangosol.coherence.component.application.console.Coherence.processCommand(Coherence.CDB:209)
    at com.tangosol.coherence.component.application.console.Coherence.run(Coherence.CDB:37)
    at com.tangosol.coherence.component.application.console.Coherence.main(Coherence.CDB:3)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at com.tangosol.net.CacheFactory.main(CacheFactory.java:1400)
    Caused by: (Wrapped: error configuring class "com.tangosol.io.pof.ConfigurablePofContext") (Wrapped: Failed to load POF configuration: Manager-config.xml) java.io.IOException: The POF configuration is
    missing: "Manager-config.xml", loader=sun.misc.Launcher$AppClassLoader@6d6f0472
    at com.tangosol.coherence.component.util.daemon.queueProcessor.Service.instantiateSerializer(Service.CDB:17)
    at com.tangosol.coherence.component.util.daemon.queueProcessor.Service.ensureSerializer(Service.CDB:31)
    at com.tangosol.coherence.component.util.daemon.queueProcessor.Service.ensureSerializer(Service.CDB:4)
    at com.tangosol.coherence.component.util.daemon.queueProcessor.service.Grid.onEnter(Grid.CDB:26)
    at com.tangosol.coherence.component.util.daemon.queueProcessor.service.grid.PartitionedService.onEnter(PartitionedService.CDB:19)
    at com.tangosol.coherence.component.util.Daemon.run(Daemon.CDB:14)
    at java.lang.Thread.run(Thread.java:662)
    Caused by: (Wrapped: Failed to load POF configuration: Manager-config.xml) java.io.IOException: The POF configuration is missing: "Manager-config.xml", loader=sun.misc.Launcher$AppClassLoader@6d6f0472

    at com.tangosol.util.Base.ensureRuntimeException(Base.java:293)
    at com.tangosol.run.xml.XmlHelper.loadResourceInternal(XmlHelper.java:330)
    at com.tangosol.run.xml.XmlHelper.loadFileOrResource(XmlHelper.java:281)
    at com.tangosol.io.pof.ConfigurablePofContext.createPofConfig(ConfigurablePofContext.java:813)
    at com.tangosol.io.pof.ConfigurablePofContext.initialize(ConfigurablePofContext.java:775)
    at com.tangosol.io.pof.ConfigurablePofContext.setContextClassLoader(ConfigurablePofContext.java:319)
    at com.tangosol.coherence.component.util.daemon.queueProcessor.Service.instantiateSerializer(Service.CDB:13)
    ... 6 more
    Caused by: java.io.IOException: The POF configuration is missing: "Manager-config.xml", loader=sun.misc.Launcher$AppClassLoader@6d6f0472
    at com.tangosol.run.xml.XmlHelper.loadResourceInternal(XmlHelper.java:316)
    ... 11 more

    Please tell me how to approach this task.

    Thanks


    DB:2.53:Portable Object Format 71

    Hi Jon, thanks for the reply. I have modified what you said in the post, but I am still getting the same error.

    Is this the correct way to build a portable object in Java?

    When I remove the serializer from my coherence-cache-config it works fine.

    Is there any mistake in this config? If I use this serializer I get the following error. Is this config required to implement POF?
    <serializer>
        <instance>
            <class-name>com.tangosol.io.pof.ConfigurablePofContext</class-name>
            <init-params>
                <init-param>
                    <param-type>String</param-type>
                    <param-value>Manager-pof-config.xml</param-value>
                </init-param>
            </init-params>
        </instance>
    </serializer>

    Map (?): cache hello
    2012-06-15 16:09:27.651/5.755 Oracle Coherence GE 3.6.0.4 Info (thread=main, member=13): Loaded cache configuration from "file:/C:/Users/lakshmana/J
    oherence-cache-config.xml"
    2012-06-15 16:09:27.784/5.888 Oracle Coherence GE 3.6.0.4 Info (thread=DistributedCache:DistributedCacheService, member=13): Loaded POF configuratio
    pace/Application/appClientModule/Manager-pof-config.xml"
    2012-06-15 16:09:27.784/5.888 Oracle Coherence GE 3.6.0.4 D4 (thread=DistributedCache:DistributedCacheService, member=13): PartitionedCache caught a
    ception: (Wrapped: error configuring class "com.tangosol.io.pof.ConfigurablePofContext") (Wrapped: Failed to load included POF configuration: Manager-
    missing: "Manager-pof-config.xml", loader=sun.misc.Launcher$AppClassLoader@6d6f0472) while exiting.
    2012-06-15 16:09:27.785/5.889 Oracle Coherence GE 3.6.0.4 Error (thread=DistributedCache:DistributedCacheService, member=13): ClusterService.doServi
    ributedCacheService, State=(SERVICE_STOPPED), Not initialized}
    2012-06-15 16:09:27.786/5.890 Oracle Coherence GE 3.6.0.4 D5 (thread=DistributedCache:DistributedCacheService, member=13): Service DistributedCacheS
    2012-06-15 16:09:27.789/5.893 Oracle Coherence GE 3.6.0.4 Error (thread=main, member=13): Error while starting service "DistributedCacheService": (W
    gosol.io.pof.ConfigurablePofContext") (Wrapped: Failed to load included POF configuration: Manager-pof-config.xml) java.io.IOException: The included P
    ml", loader=sun.misc.Launcher$AppClassLoader@6d6f0472
    at com.tangosol.coherence.component.util.Daemon.start(Daemon.CDB:52)
    at com.tangosol.coherence.component.util.daemon.queueProcessor.Service.start(Service.CDB:7)
    at com.tangosol.coherence.component.util.daemon.queueProcessor.service.Grid.start(Grid.CDB:6)
    at com.tangosol.coherence.component.util.SafeService.startService(SafeService.CDB:28)
    at com.tangosol.coherence.component.util.safeService.SafeCacheService.startService(SafeCacheService.CDB:5)
    at com.tangosol.coherence.component.util.SafeService.ensureRunningService(SafeService.CDB:27)
    at com.tangosol.coherence.component.util.SafeService.start(SafeService.CDB:14)
    at com.tangosol.net.DefaultConfigurableCacheFactory.ensureServiceInternal(DefaultConfigurableCacheFactory.java:1057)
    at com.tangosol.net.DefaultConfigurableCacheFactory.ensureService(DefaultConfigurableCacheFactory.java:892)
    at com.tangosol.net.DefaultConfigurableCacheFactory.ensureCache(DefaultConfigurableCacheFactory.java:874)
    at com.tangosol.net.DefaultConfigurableCacheFactory.configureCache(DefaultConfigurableCacheFactory.java:1231)
    at com.tangosol.net.DefaultConfigurableCacheFactory.ensureCache(DefaultConfigurableCacheFactory.java:290)
    at com.tangosol.net.CacheFactory.getCache(CacheFactory.java:735)
    at com.tangosol.coherence.component.application.console.Coherence.doCache(Coherence.CDB:18)
    at com.tangosol.coherence.component.application.console.Coherence.processCommand(Coherence.CDB:209)
    at com.tangosol.coherence.component.application.console.Coherence.run(Coherence.CDB:37)
    at com.tangosol.coherence.component.application.console.Coherence.main(Coherence.CDB:3)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at com.tangosol.net.CacheFactory.main(CacheFactory.java:1400)
    Caused by: (Wrapped: error configuring class "com.tangosol.io.pof.ConfigurablePofContext") (Wrapped: Failed to load included POF configuration: Manage
    d POF configuration is missing: "Manager-pof-config.xml", loader=sun.misc.Launcher$AppClassLoader@6d6f0472
    at com.tangosol.coherence.component.util.daemon.queueProcessor.Service.instantiateSerializer(Service.CDB:17)
    at com.tangosol.coherence.component.util.daemon.queueProcessor.Service.ensureSerializer(Service.CDB:31)
    at com.tangosol.coherence.component.util.daemon.queueProcessor.Service.ensureSerializer(Service.CDB:4)
    at com.tangosol.coherence.component.util.daemon.queueProcessor.service.Grid.onEnter(Grid.CDB:26)
    at com.tangosol.coherence.component.util.daemon.queueProcessor.service.grid.PartitionedService.onEnter(PartitionedService.CDB:19)
    at com.tangosol.coherence.component.util.Daemon.run(Daemon.CDB:14)
    at java.lang.Thread.run(Thread.java:662)
    Caused by: (Wrapped: Failed to load included POF configuration: Manager-pof-config.xml) java.io.IOException: The included POF configuration is missing
    er$AppClassLoader@6d6f0472
    at com.tangosol.util.Base.ensureRuntimeException(Base.java:293)
    at com.tangosol.run.xml.XmlHelper.loadResourceInternal(XmlHelper.java:330)
    at com.tangosol.run.xml.XmlHelper.loadFileOrResource(XmlHelper.java:281)
    at com.tangosol.io.pof.ConfigurablePofContext.createPofConfig(ConfigurablePofContext.java:856)
    at com.tangosol.io.pof.ConfigurablePofContext.initialize(ConfigurablePofContext.java:775)
    at com.tangosol.io.pof.ConfigurablePofContext.setContextClassLoader(ConfigurablePofContext.java:319)
    at com.tangosol.coherence.component.util.daemon.queueProcessor.Service.instantiateSerializer(Service.CDB:13)
    ... 6 more
    Caused by: java.io.IOException: The included POF configuration is missing: "Manager-pof-config.xml", loader=sun.misc.Launcher$AppClassLoader@6d6f0472
    at com.tangosol.run.xml.XmlHelper.loadResourceInternal(XmlHelper.java:316)
    ... 11 more

    2012-06-15 16:09:27.789/5.893 Oracle Coherence GE 3.6.0.4 Error (thread=main, member=13):
    (Wrapped) (Wrapped: error configuring class "com.tangosol.io.pof.ConfigurablePofContext") (Wrapped: Failed to load included POF configuration: Manager
    POF configuration is missing: "Manager-pof-config.xml", loader=sun.misc.Launcher$AppClassLoader@6d6f0472
    at com.tangosol.coherence.component.util.Daemon.start(Daemon.CDB:52)
    at com.tangosol.coherence.component.util.daemon.queueProcessor.Service.start(Service.CDB:7)
    at com.tangosol.coherence.component.util.daemon.queueProcessor.service.Grid.start(Grid.CDB:6)
    at com.tangosol.coherence.component.util.SafeService.startService(SafeService.CDB:28)
    at com.tangosol.coherence.component.util.safeService.SafeCacheService.startService(SafeCacheService.CDB:5)
    at com.tangosol.coherence.component.util.SafeService.ensureRunningService(SafeService.CDB:27)
    at com.tangosol.coherence.component.util.SafeService.start(SafeService.CDB:14)
    at com.tangosol.net.DefaultConfigurableCacheFactory.ensureServiceInternal(DefaultConfigurableCacheFactory.java:1057)
    at com.tangosol.net.DefaultConfigurableCacheFactory.ensureService(DefaultConfigurableCacheFactory.java:892)
    at com.tangosol.net.DefaultConfigurableCacheFactory.ensureCache(DefaultConfigurableCacheFactory.java:874)
    at com.tangosol.net.DefaultConfigurableCacheFactory.configureCache(DefaultConfigurableCacheFactory.java:1231)
    at com.tangosol.net.DefaultConfigurableCacheFactory.ensureCache(DefaultConfigurableCacheFactory.java:290)
    at com.tangosol.net.CacheFactory.getCache(CacheFactory.java:735)
    at com.tangosol.coherence.component.application.console.Coherence.doCache(Coherence.CDB:18)
    at com.tangosol.coherence.component.application.console.Coherence.processCommand(Coherence.CDB:209)
    at com.tangosol.coherence.component.application.console.Coherence.run(Coherence.CDB:37)
    at com.tangosol.coherence.component.application.console.Coherence.main(Coherence.CDB:3)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at com.tangosol.net.CacheFactory.main(CacheFactory.java:1400)
    Caused by: (Wrapped: error configuring class "com.tangosol.io.pof.ConfigurablePofContext") (Wrapped: Failed to load included POF configuration: Manage
    d POF configuration is missing: "Manager-pof-config.xml", loader=sun.misc.Launcher$AppClassLoader@6d6f0472
    at com.tangosol.coherence.component.util.daemon.queueProcessor.Service.instantiateSerializer(Service.CDB:17)
    at com.tangosol.coherence.component.util.daemon.queueProcessor.Service.ensureSerializer(Service.CDB:31)
    at com.tangosol.coherence.component.util.daemon.queueProcessor.Service.ensureSerializer(Service.CDB:4)
    at com.tangosol.coherence.component.util.daemon.queueProcessor.service.Grid.onEnter(Grid.CDB:26)
    at com.tangosol.coherence.component.util.daemon.queueProcessor.service.grid.PartitionedService.onEnter(PartitionedService.CDB:19)
    at com.tangosol.coherence.component.util.Daemon.run(Daemon.CDB:14)
    at java.lang.Thread.run(Thread.java:662)
    Caused by: (Wrapped: Failed to load included POF configuration: Manager-pof-config.xml) java.io.IOException: The included POF configuration is missing: "Manager-pof-config.xml", loader=sun.misc.Launcher$AppClassLoader@6d6f0472
    at com.tangosol.util.Base.ensureRuntimeException(Base.java:293)
    at com.tangosol.run.xml.XmlHelper.loadResourceInternal(XmlHelper.java:330)
    at com.tangosol.run.xml.XmlHelper.loadFileOrResource(XmlHelper.java:281)
    at com.tangosol.io.pof.ConfigurablePofContext.createPofConfig(ConfigurablePofContext.java:856)
    at com.tangosol.io.pof.ConfigurablePofContext.initialize(ConfigurablePofContext.java:775)
    at com.tangosol.io.pof.ConfigurablePofContext.setContextClassLoader(ConfigurablePofContext.java:319)
    at com.tangosol.coherence.component.util.daemon.queueProcessor.Service.instantiateSerializer(Service.CDB:13)
    ... 6 more
    Caused by: java.io.IOException: The included POF configuration is missing: "Manager-pof-config.xml", loader=sun.misc.Launcher$AppClassLoader@6d6f0472
    at com.tangosol.run.xml.XmlHelper.loadResourceInternal(XmlHelper.java:316)
    ... 11 more

    Map (?):

    Thanks..
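
    Since the root cause above is simply that "Manager-pof-config.xml" cannot be found by the application class loader, a quick standalone check (a sketch; the file name is taken from the stack trace) is:

    public class PofIncludeCheck {
        public static void main(String[] args) {
            // Prints where (or whether) the included POF config is visible
            // on the classpath of the current context class loader.
            java.net.URL url = Thread.currentThread().getContextClassLoader()
                    .getResource("Manager-pof-config.xml");
            System.out.println(url == null
                    ? "Manager-pof-config.xml is NOT on the classpath"
                    : "Found at: " + url);
        }
    }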

  • RELEVANCY SCORE 2.53

    DB:2.53:Xml Serialization Error- While Testing Bapi Turned Web Service as



    I have a requirement to create sales orders in SAP R/3 from an e-commerce site. I went through many forums suggesting exposing FMs as Web Services, so I wrapped BAPI_SALESORDER_CREATEFROMDAT2 and BAPI_TRANSACTION_COMMIT into one FM and exposed it as a Web Service. A test sequence ran successfully.

    When I tested the web service without supplying any values, I got a response asking for a "Sold-to Party or Ship-To Party". When testing it with some values, I got the error below:

    XML Serialization Error. Object content does not correspond to Schema restrictions of type [urn:sap-com:document:sap:rfc:functions][numeric4].

    DB:2.53:Xml Serialization Error- While Testing Bapi Turned Web Service as


    Thanks Gaurav!

    XML serialization error is resolved.

    I'll raise another thread for the Eclipse Validation Error.

  • RELEVANCY SCORE 2.53

    DB:2.53:How Can I Get Wrapperexception From Storagenode Or Cacheloader 7s


    Hi,

    We are facing an issue: when any exception occurs during load() of a CacheLoader, it is transmitted as a PortableException when POF is enabled; in non-POF mode it is transmitted as a WrapperException.

    Even runtime exceptions such as NullPointerException are transmitted as PortableException when POF is enabled.

    So how can we get a WrapperException in POF mode?

    CacheLoader.java

    public class CacheLoader extends AbstractCacheLoader {

        public Object load(Object key) {
            assert key instanceof Key;
            try {
                return ((Key) key).retrieve();
            } catch (CacheException ce) {
                ce.printStackTrace();
                // Checked exceptions thrown from retrieve() are
                // handled by the application
                throw new WrapperException(ce);
            } catch (Exception ce) {
                ce.printStackTrace();
                throw new WrapperException(ce,
                        " Unexpected storage error while retrieving data for the Key : "
                                + key);
            }
        }
    }

    Thanks in advance.

    DB:2.53:How Can I Get Wrapperexception From Storagenode Or Cacheloader 7s

    Hi Suhas,

    For the specific exceptions of interest, you can define a custom serializer to handle each exception as a distinct POF type. By default, all exceptions are handled by a single catch-all serializer.
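
    A minimal sketch of that approach, using the CacheException from the posted CacheLoader (assuming it has a String constructor; the serializer class below is hypothetical and would be registered against the exception's user-type entry in the POF configuration):

    import java.io.IOException;

    import com.tangosol.io.pof.PofReader;
    import com.tangosol.io.pof.PofSerializer;
    import com.tangosol.io.pof.PofWriter;

    // Lets CacheException travel across the wire as its own POF user type
    // instead of being handled by the catch-all exception serializer.
    public class CacheExceptionPofSerializer implements PofSerializer {

        public void serialize(PofWriter out, Object o) throws IOException {
            CacheException e = (CacheException) o;
            out.writeString(0, e.getMessage());
            out.writeRemainder(null);
        }

        public Object deserialize(PofReader in) throws IOException {
            String message = in.readString(0);
            in.readRemainder();
            return new CacheException(message);
        }
    }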

    Thanks,

    Mark
    Oracle Coherence

  • RELEVANCY SCORE 2.53

    DB:2.53:Coherence Unknown User Type Error p3


    I am getting the following error while trying to set up coherence on a project:

    {"message":"unknown user type: com.trgr.cobalt.webcontent.coherence.PortableProperties","stackTrace":"java.lang.IllegalArgumentException: unknown user type: com.trgr.cobalt.webcontent.coherence.PortableProperties
        at com.tangosol.io.pof.ConfigurablePofContext.getUserTypeIdentifier(ConfigurablePofContext.java:399)
        at com.tangosol.io.pof.ConfigurablePofContext.getUserTypeIdentifier(ConfigurablePofContext.java:388)
        at com.tangosol.io.pof.PofBufferWriter.writeObject(PofBufferWriter.java:1432)
        at com.tangosol.io.pof.ConfigurablePofContext.serialize(ConfigurablePofContext.java:337)
        at com.tangosol.util.ExternalizableHelper.serializeInternal(ExternalizableHelper.java:2525)
        at com.tangosol.util.ExternalizableHelper.toBinary(ExternalizableHelper.java:206)
        at com.tangosol.coherence.component.util.daemon.queueProcessor.service.grid.DistributedCache$ConverterValueToBinary.convert(DistributedCache.CDB:3)
        at com.tangosol.util.ConverterCollections$ConverterMap.put(ConverterCollections.java:1566)
        at com.tangosol.coherence.component.util.daemon.queueProcessor.service.grid.DistributedCache$ViewMap.put(DistributedCache.CDB:1)
        at com.tangosol.coherence.component.util.SafeNamedCache.put(SafeNamedCache.CDB:1)
        at ....

    I have the following pof-config.xml file in my project.

    From properties file:

    tangosol.pof.config=webcontent-pof-config.xml

    XML:

    <?xml version="1.0" encoding="UTF-8"?>
    <!DOCTYPE pof-config SYSTEM "pof-config.dtd">
    <pof-config>
      <user-type-list>
        <include>coherence-pof-config.xml</include>
        <user-type>
          <type-id>16001</type-id>
          <class-name>com.trgr.cobalt.webcontent.coherence.PortableProperties</class-name>
        </user-type>
      </user-type-list>
    </pof-config>

    PortableProperties is a pretty simple class that we use to interact with the coherence cache. It simply uses a Map<String, Object> and reads/writes the contents to the POF stream. Here is the class signature for reference:

    public class PortableProperties extends AbstractEvolvable implements PortableObject, Serializable

    From what I can tell, that is all the configuration you need. Does anyone have any clue what I am missing to make this work? Thanks.

    DB:2.53:Coherence Unknown User Type Error p3

    It seems that your application is loading the default POF config file and not the one defined in your configuration, which is why it cannot find the user-type you defined. Can you please share the logs you get on your console while starting up the server?
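
    One quick way to confirm which POF configuration is actually being picked up (a standalone sketch, not from the original thread; it reuses the file and class names from the post above) is to build a ConfigurablePofContext against the intended file and look up the type id:

    import com.tangosol.io.pof.ConfigurablePofContext;

    // Verifies that webcontent-pof-config.xml is on the classpath and that
    // PortableProperties is registered in it; getUserTypeIdentifier throws
    // IllegalArgumentException if the type is unknown.
    public class PofConfigCheck {
        public static void main(String[] args) {
            ConfigurablePofContext ctx =
                    new ConfigurablePofContext("webcontent-pof-config.xml");
            int typeId = ctx.getUserTypeIdentifier(
                    com.trgr.cobalt.webcontent.coherence.PortableProperties.class);
            System.out.println("PortableProperties registered as POF type " + typeId);
        }
    }

    If this lookup succeeds locally, the next thing to verify is that every member is actually started with -Dtangosol.pof.config=webcontent-pof-config.xml (or that the cache service's serializer points at that file); a member that silently falls back to the default pof-config.xml would report exactly this "unknown user type" error.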

  • RELEVANCY SCORE 2.53

    DB:2.53:Benefits Of Pof fm


    I've seen the benefits of POF many times in the past.

    One of our teams is using big Int Arrays and asked whether POF would help. I answered "I'm sure it will".

    We've just done a test and the answer is "Not very much".

    We are storing a random array of size 30 x 20 (i.e. 600 values), keyed by an int.

    When I add 30,000 of these, the readings with POF off vs. on are 162G vs. 158G. (I do a GC first and then use JConsole to take the reading.)

    I guess this is because Java serialization is already pretty good at storing these arrays (or even that, with POF turned off, it is using ExternalizableLite under the covers for them).

    POF is also very efficient at storing small numbers (e.g. 1-256), and we're deliberately forcing it out of that comfort zone by choosing a range of 0-100K.

    Does that make sense to people who know these internals?

    Best, Andrew.

    ------

    import com.tangosol.net.CacheFactory;
    import com.tangosol.net.NamedCache;

    public class PofSizingTest {

        public static void main(String[] args) throws Exception {
            long startTime = System.currentTimeMillis();
            int x = 20;
            int y = 30;
            int numberOfArrays = 30000;
            NamedCache namedCache = CacheFactory.getCache("POF_TEST_DOUBLE_ARR");
            for (int n = 0; n < numberOfArrays; n++) {
                // Fill a 30 x 20 array with random values in the range 0 - 100K
                double[][] profiles = new double[y][x];
                for (int i = 0; i < y; i++) {
                    for (int j = 0; j < x; j++) {
                        profiles[i][j] = Math.random() * 100000;
                    }
                }
                namedCache.put(n, profiles);
            }
            System.out.println("Time Taken to put " + numberOfArrays
                    + " values = " + (System.currentTimeMillis() - startTime));
        }
    }

    DB:2.53:Benefits Of Pof fm

    Thanks Cam and Alexey - good stuff. Interesting to understand the basic types stuff. Best, Andrew.
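
    A standalone way to sanity-check readings like this, without a cluster or JConsole, is to compare serialized sizes directly. The sketch below is illustrative only (it assumes a pof-config.xml is available on the classpath; the class name is made up):

    import com.tangosol.io.pof.ConfigurablePofContext;
    import com.tangosol.util.Binary;
    import com.tangosol.util.ExternalizableHelper;

    // Compares the serialized size of one 30 x 20 array of random values
    // under POF and under default (Java) serialization.
    public class SerializedSizeCheck {
        public static void main(String[] args) {
            double[][] profiles = new double[30][20];
            for (int i = 0; i < 30; i++) {
                for (int j = 0; j < 20; j++) {
                    profiles[i][j] = Math.random() * 100000;
                }
            }
            ConfigurablePofContext pofContext = new ConfigurablePofContext("pof-config.xml");
            Binary binPof = ExternalizableHelper.toBinary(profiles, pofContext);
            Binary binDefault = ExternalizableHelper.toBinary(profiles);
            System.out.println("POF size:     " + binPof.length() + " bytes");
            System.out.println("Default size: " + binDefault.length() + " bytes");
        }
    }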

  • RELEVANCY SCORE 2.53

    DB:2.53:Standalone Pof Serialization 7c


    Hi,

    Is it possible to use POF Serialization outside of Coherence? We need to send cache data between our Coherence cluster and a reporting server, over JMS. We figured POF would be a good way to keep the sizes down, since all our cache objects implement the PortableObject interface. However, I can't find how we access a Serializer / Deserializer.

    Thanks
    Matt

    Edited by: 899446 on 08-Mar-2012 07:28

    DB:2.53:Standalone Pof Serialization 7c

    We played with that idea a few years back; I hope my memory is still right.

    You can create a ConfigurablePofContext with a POF config file and then use it to serialize/deserialize. Or, if you want, you can even get a specific PofSerializer if you know the POF type id.

    Not sure about any licensing issues, though.
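
    A minimal sketch of standalone use (the class and config file names are made up; it assumes MyCacheObject is registered as a user type in my-pof-config.xml):

    import com.tangosol.io.pof.ConfigurablePofContext;
    import com.tangosol.util.Binary;
    import com.tangosol.util.ExternalizableHelper;

    public class StandalonePofExample {
        public static void main(String[] args) {
            // A POF context built directly from the config file, no cluster needed
            ConfigurablePofContext ctx = new ConfigurablePofContext("my-pof-config.xml");

            // Serialize to bytes suitable for a JMS BytesMessage
            MyCacheObject value = new MyCacheObject();
            byte[] bytes = ExternalizableHelper.toBinary(value, ctx).toByteArray();

            // Deserialize on the receiving side with the same POF context
            Object restored = ExternalizableHelper.fromBinary(new Binary(bytes), ctx);
            System.out.println(restored);
        }
    }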

  • RELEVANCY SCORE 2.52

    DB:2.52:Keeping Values Deserialized dj


    I gather from information about the default-serializer element that the default serialization approach is POF via the PortableObject interface.

    I understand that serialization of some sort is absolutely required for transfer of objects over the wire between grid members or extend clients.

    However, is it under configurable control to force the grid to store all its data deserialized, as Java objects? My guess is "no".

    DB:2.52:Keeping Values Deserialized dj

    rehevkor5 wrote:
    I gather from information about the default-serializer element that the default serialization approach is POF via the PortableObject interface.

    I understand that serialization of some sort is absolutely required for transfer of objects over the wire between grid members or extend clients.

    However, is it under configurable control to force the grid to store all its data deserialized, as Java objects? My guess is "no".

    The answer is "somewhat"...

    Replicated caches keep their data lazily deserialized. However, replicated caches are usually not what you want to use.

    For distributed caches, you can store the data in binary form AND ALSO store it in deserialized form, at a significant overhead, in an index (preferably non-ordered) created with IdentityExtractor.INSTANCE; you can then get hold of the deserialized form from the forward index (QueryMap.Entry.extract(IdentityExtractor.INSTANCE) or MapIndex.get(backingMapKey)). To reduce the overhead, you could create a forward-only index class and a corresponding extractor class implementing IndexAwareExtractor which does not maintain a reverse map, only a forward map, which is much cheaper to maintain.
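
    A minimal sketch of the indexing half of that suggestion (the cache name is made up; the forward-only index variant is left out):

    import com.tangosol.net.CacheFactory;
    import com.tangosol.net.NamedCache;
    import com.tangosol.util.extractor.IdentityExtractor;

    public class DeserializedIndexExample {
        public static void main(String[] args) {
            // Keep a deserialized copy of every value in a non-ordered index,
            // at the cost of roughly doubling the memory used for the cache data.
            NamedCache cache = CacheFactory.getCache("my-cache");
            cache.addIndex(IdentityExtractor.INSTANCE,
                    /* fOrdered */ false, /* comparator */ null);

            // Server-side code that receives a QueryMap.Entry (e.g. a custom
            // filter) can then call entry.extract(IdentityExtractor.INSTANCE),
            // which is served from the forward index instead of deserializing.
        }
    }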

    Best regards,

    Robert