How can I use Scala's MurmurHash implementation: scala.util.MurmurHash3?

I am writing a BloomFilter and wanted to use Scala's default implementation of MurmurHash3, scala.util.MurmurHash3. Compilation fails, however, with the following error:

[error] /mnt/hgfs/dr/sandbox/dr-commons/src/main/scala/dr/commons/collection/BloomFilter.scala:214: MurmurHash3 is not a member of scala.util
[error]   import scala.util.{MurmurHash3 => MH}

      

I am using Scala 2.9.1 and sbt 0.11.2.

Is the MurmurHash3 class not in the 2.9.1 library by default? I assumed it would be, since it appears to be used quite a lot within the library itself, as far as I can see.

+3




3 answers


It is named simply scala.util.MurmurHash, without the 3, but it really is the MurmurHash 3 algorithm (see the comments in the source).
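For reference, here is a minimal sketch of using it on 2.9.1, assuming the companion object exposes a stringHash helper as the 2.9.1 sources appear to do; adjust if your version differs:

// Scala 2.9.x: the MurmurHash 3 implementation lives at scala.util.MurmurHash
import scala.util.{MurmurHash => MH}

object MurmurHashExample extends App {
  // hash a String to a 32-bit Int via the companion object's helper
  val h: Int = MH.stringHash("hello bloom filter")
  println("hash = " + h)
}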



EDIT I just saw that Rex Kerr is the author of scala.util.MurmurHash. I would advise you not to accept this answer (assuming it is correct); since Rex Kerr is on StackOverflow, he may come by and give you a much better one ...

+4




The following works for me:



import scala.util.hashing.MurmurHash3
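This import exists from Scala 2.10 onwards, where the implementation moved into the scala.util.hashing package. Since the question is about a Bloom filter, here is a rough sketch (my own illustration, not code from the question) of how several hash values per key could be derived by varying the seed passed to stringHash:

import scala.util.hashing.MurmurHash3

// Illustrative helper: compute k candidate bit positions for `key` in a
// filter of `numBits` bits by hashing the key with k different seeds.
def bloomIndices(key: String, k: Int, numBits: Int): Seq[Int] =
  (0 until k).map { i =>
    val h = MurmurHash3.stringHash(key, i)  // the loop index acts as the seed
    ((h % numBits) + numBits) % numBits     // map the hash into [0, numBits)
  }

// e.g. five bit positions in a 1024-bit filter
println(bloomIndices("hello", 5, 1024))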

+1




I am using Scala 2.11 and Apache Spark 1.6.2. It works fine; with these versions I am not getting any error. (Note that MurmurHash3 itself ships with the Scala standard library; the Spark dependencies below are simply the setup I happen to be using.)

import scala.util.hashing.{MurmurHash3 => MH3}

val data = "I am SANTHOSH"
val sample = MH3.stringHash(data, MH3.stringSeed) // hash the string with the library's default string seed
println("Hash Value: " + sample)

      

<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.11</artifactId>
    <version>1.6.2</version>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-sql_2.11</artifactId>
    <version>1.6.2</version>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-mllib_2.11</artifactId>
    <version>1.6.2</version>
</dependency>

      

+1








